Apr 23 16:32:33.492139 ip-10-0-128-198 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 23 16:32:33.492152 ip-10-0-128-198 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 23 16:32:33.492162 ip-10-0-128-198 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 23 16:32:33.492494 ip-10-0-128-198 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 23 16:32:43.552844 ip-10-0-128-198 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 23 16:32:43.552862 ip-10-0-128-198 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 3e4111b0c6794f4382b17c01a3359c53 --
Apr 23 16:35:13.821583 ip-10-0-128-198 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 16:35:14.281623 ip-10-0-128-198 kubenswrapper[2580]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:35:14.281623 ip-10-0-128-198 kubenswrapper[2580]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 16:35:14.281623 ip-10-0-128-198 kubenswrapper[2580]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:35:14.281623 ip-10-0-128-198 kubenswrapper[2580]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 16:35:14.281623 ip-10-0-128-198 kubenswrapper[2580]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:35:14.283017 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.282936 2580 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 16:35:14.288794 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288772 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:35:14.288794 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288791 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:35:14.288794 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288795 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:35:14.288794 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288799 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:35:14.288962 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288802 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:35:14.288962 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288805 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:35:14.288962 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288808 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:35:14.288962 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288811 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:35:14.288962 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288814 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:35:14.288962 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288816 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:35:14.288962 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288819 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:35:14.288962 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288822 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:35:14.288962 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288825 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:35:14.288962 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288828 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:35:14.288962 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288830 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:35:14.288962 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288833 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:35:14.288962 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288835 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:35:14.288962 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288840 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:35:14.288962 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288844 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:35:14.288962 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288848 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:35:14.288962 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288851 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:35:14.288962 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288854 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:35:14.288962 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288857 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:35:14.289426 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288866 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:35:14.289426 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288869 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:35:14.289426 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288872 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:35:14.289426 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288874 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:35:14.289426 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288877 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:35:14.289426 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288879 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:35:14.289426 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288881 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:35:14.289426 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288884 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:35:14.289426 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288886 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:35:14.289426 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288889 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:35:14.289426 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288891 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:35:14.289426 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288894 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:35:14.289426 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288896 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:35:14.289426 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288899 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:35:14.289426 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288902 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:35:14.289426 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288905 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:35:14.289426 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288908 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:35:14.289426 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288910 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:35:14.289426 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288913 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:35:14.289426 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288915 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:35:14.289942 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288918 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:35:14.289942 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288921 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:35:14.289942 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288923 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:35:14.289942 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288926 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:35:14.289942 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288928 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:35:14.289942 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288931 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:35:14.289942 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288933 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:35:14.289942 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288936 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:35:14.289942 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288938 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:35:14.289942 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288942 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:35:14.289942 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288944 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:35:14.289942 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288947 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:35:14.289942 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288949 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:35:14.289942 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288952 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:35:14.289942 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288956 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:35:14.289942 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288960 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:35:14.289942 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288963 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:35:14.289942 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288966 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:35:14.289942 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288969 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:35:14.290412 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288971 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:35:14.290412 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288974 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:35:14.290412 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288977 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:35:14.290412 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288980 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:35:14.290412 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288982 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:35:14.290412 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288985 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:35:14.290412 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288988 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:35:14.290412 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288991 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:35:14.290412 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288994 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:35:14.290412 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288996 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:35:14.290412 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.288999 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:35:14.290412 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289002 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:35:14.290412 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289004 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:35:14.290412 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289006 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:35:14.290412 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289009 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:35:14.290412 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289012 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:35:14.290412 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289016 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:35:14.290412 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289019 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:35:14.290412 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289022 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:35:14.290412 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289025 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:35:14.290923 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289027 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:35:14.290923 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289030 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:35:14.290923 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289033 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:35:14.290923 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289036 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:35:14.290923 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289441 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:35:14.290923 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289447 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:35:14.290923 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289451 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:35:14.290923 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289454 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:35:14.290923 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289457 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:35:14.290923 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289459 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:35:14.290923 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289462 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:35:14.290923 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289465 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:35:14.290923 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289468 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:35:14.290923 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289471 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:35:14.290923 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289473 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:35:14.290923 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289476 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:35:14.290923 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289479 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:35:14.290923 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289481 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:35:14.290923 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289485 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:35:14.290923 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289489 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:35:14.291431 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289491 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:35:14.291431 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289494 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:35:14.291431 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289497 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:35:14.291431 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289500 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:35:14.291431 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289503 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:35:14.291431 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289505 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:35:14.291431 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289508 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:35:14.291431 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289511 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:35:14.291431 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289515 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:35:14.291431 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289517 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:35:14.291431 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289520 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:35:14.291431 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289523 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:35:14.291431 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289525 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:35:14.291431 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289528 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:35:14.291431 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289530 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:35:14.291431 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289533 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:35:14.291431 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289535 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:35:14.291431 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289538 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:35:14.291431 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289540 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:35:14.291897 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289543 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:35:14.291897 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289545 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:35:14.291897 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289548 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:35:14.291897 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289550 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:35:14.291897 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289553 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:35:14.291897 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289555 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:35:14.291897 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289558 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:35:14.291897 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289560 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:35:14.291897 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289563 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:35:14.291897 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289565 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:35:14.291897 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289568 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:35:14.291897 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289571 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:35:14.291897 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289575 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:35:14.291897 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289577 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:35:14.291897 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289580 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:35:14.291897 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289583 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:35:14.291897 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289585 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:35:14.291897 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289588 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:35:14.291897 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289590 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:35:14.291897 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289593 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:35:14.292409 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289595 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:35:14.292409 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289598 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:35:14.292409 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289602 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:35:14.292409 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289607 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:35:14.292409 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289612 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:35:14.292409 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289615 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:35:14.292409 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289618 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:35:14.292409 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289621 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:35:14.292409 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289623 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:35:14.292409 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289626 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:35:14.292409 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289629 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:35:14.292409 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289631 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:35:14.292409 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289634 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:35:14.292409 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289637 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:35:14.292409 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289640 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:35:14.292409 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289642 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:35:14.292409 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289645 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:35:14.292409 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289647 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:35:14.292409 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289650 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:35:14.292884 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289653 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:35:14.292884 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289655 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:35:14.292884 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289658 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:35:14.292884 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289661 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:35:14.292884 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289663 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:35:14.292884 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289666 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:35:14.292884 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289668 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:35:14.292884 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289671 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:35:14.292884 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289673 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:35:14.292884 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289676 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:35:14.292884 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289679 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:35:14.292884 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.289681 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:35:14.292884 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291000 2580 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 16:35:14.292884 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291011 2580 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 16:35:14.292884 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291018 2580 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 16:35:14.292884 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291022 2580 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 16:35:14.292884 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291028 2580 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 16:35:14.292884 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291031 2580 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 16:35:14.292884 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291037 2580 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 16:35:14.292884 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291041 2580 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 16:35:14.292884 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291044 2580 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 16:35:14.293412 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291048 2580 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 16:35:14.293412 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291051 2580 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 16:35:14.293412 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291054 2580 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 16:35:14.293412 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291057 2580 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 16:35:14.293412 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291060 2580 flags.go:64] FLAG: --cgroup-root=""
Apr 23 16:35:14.293412 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291064 2580 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 16:35:14.293412 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291066 2580 flags.go:64] FLAG: --client-ca-file=""
Apr 23 16:35:14.293412 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291069 2580 flags.go:64] FLAG: --cloud-config=""
Apr 23 16:35:14.293412 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291072 2580 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 16:35:14.293412 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291075 2580 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 16:35:14.293412 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291080 2580 flags.go:64] FLAG: --cluster-domain=""
Apr 23 16:35:14.293412 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291083 2580 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 16:35:14.293412 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291086 2580 flags.go:64] FLAG: --config-dir=""
Apr 23 16:35:14.293412 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291089 2580 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 16:35:14.293412 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291093 2580 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 16:35:14.293412 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291097 2580 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 16:35:14.293412 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291100 2580 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 16:35:14.293412 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291103 2580 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 16:35:14.293412 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291106 2580 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 16:35:14.293412 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291110 2580 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 16:35:14.293412 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291112 2580 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 16:35:14.293412 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291115 2580 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 16:35:14.293412 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291119 2580 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 16:35:14.293412 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291122 2580 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 16:35:14.293412 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291127 2580 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 16:35:14.294034 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291130 2580 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 16:35:14.294034 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291133 2580 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 16:35:14.294034 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291135 2580 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 16:35:14.294034 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291140 2580 flags.go:64] FLAG: --enable-server="true"
Apr 23 16:35:14.294034 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291143 2580 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 16:35:14.294034 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291147 2580 flags.go:64] FLAG: --event-burst="100"
Apr 23 16:35:14.294034 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291150 2580 flags.go:64] FLAG: --event-qps="50"
Apr 23 16:35:14.294034 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291153 2580 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 16:35:14.294034 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291156 2580 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 16:35:14.294034 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291159 2580 flags.go:64] FLAG: --eviction-hard=""
Apr 23 16:35:14.294034 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291163 2580 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 16:35:14.294034 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291166 2580 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 16:35:14.294034 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291169 2580 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 16:35:14.294034 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291172 2580 flags.go:64] FLAG: --eviction-soft=""
Apr 23 16:35:14.294034 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291175 2580 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 16:35:14.294034 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291178 2580 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 16:35:14.294034 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291181 2580 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 16:35:14.294034 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291184 2580 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 16:35:14.294034 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291187 2580 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 16:35:14.294034 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291190 2580 flags.go:64] FLAG:
--fail-swap-on="true" Apr 23 16:35:14.294034 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291193 2580 flags.go:64] FLAG: --feature-gates="" Apr 23 16:35:14.294034 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291197 2580 flags.go:64] FLAG: --file-check-frequency="20s" Apr 23 16:35:14.294034 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291200 2580 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 23 16:35:14.294034 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291203 2580 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 23 16:35:14.294034 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291206 2580 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 23 16:35:14.294666 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291209 2580 flags.go:64] FLAG: --healthz-port="10248" Apr 23 16:35:14.294666 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291212 2580 flags.go:64] FLAG: --help="false" Apr 23 16:35:14.294666 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291215 2580 flags.go:64] FLAG: --hostname-override="ip-10-0-128-198.ec2.internal" Apr 23 16:35:14.294666 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291218 2580 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 16:35:14.294666 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291221 2580 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 16:35:14.294666 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291224 2580 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 16:35:14.294666 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291228 2580 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 16:35:14.294666 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291231 2580 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 16:35:14.294666 ip-10-0-128-198 kubenswrapper[2580]: I0423 
16:35:14.291234 2580 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 16:35:14.294666 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291237 2580 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 16:35:14.294666 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291240 2580 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 16:35:14.294666 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291243 2580 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 16:35:14.294666 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291247 2580 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 16:35:14.294666 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291250 2580 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 16:35:14.294666 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291253 2580 flags.go:64] FLAG: --kube-reserved="" Apr 23 16:35:14.294666 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291256 2580 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 16:35:14.294666 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291258 2580 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 16:35:14.294666 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291261 2580 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 16:35:14.294666 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291264 2580 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 16:35:14.294666 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291267 2580 flags.go:64] FLAG: --lock-file="" Apr 23 16:35:14.294666 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291270 2580 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 16:35:14.294666 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291273 2580 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 16:35:14.294666 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291276 2580 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 16:35:14.294666 
ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291281 2580 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 16:35:14.295284 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291284 2580 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 16:35:14.295284 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291305 2580 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 16:35:14.295284 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291308 2580 flags.go:64] FLAG: --logging-format="text" Apr 23 16:35:14.295284 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291311 2580 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 16:35:14.295284 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291314 2580 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 16:35:14.295284 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291317 2580 flags.go:64] FLAG: --manifest-url="" Apr 23 16:35:14.295284 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291320 2580 flags.go:64] FLAG: --manifest-url-header="" Apr 23 16:35:14.295284 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291325 2580 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 16:35:14.295284 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291329 2580 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 16:35:14.295284 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291333 2580 flags.go:64] FLAG: --max-pods="110" Apr 23 16:35:14.295284 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291336 2580 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 16:35:14.295284 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291339 2580 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 16:35:14.295284 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291342 2580 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 16:35:14.295284 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291345 2580 
flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 16:35:14.295284 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291348 2580 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 16:35:14.295284 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291351 2580 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 16:35:14.295284 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291354 2580 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 16:35:14.295284 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291362 2580 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 16:35:14.295284 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291365 2580 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 16:35:14.295284 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291368 2580 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 16:35:14.295284 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291372 2580 flags.go:64] FLAG: --pod-cidr="" Apr 23 16:35:14.295284 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291375 2580 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 16:35:14.295284 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291380 2580 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 16:35:14.295889 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291383 2580 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 16:35:14.295889 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291386 2580 flags.go:64] FLAG: --pods-per-core="0" Apr 23 16:35:14.295889 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291390 2580 flags.go:64] FLAG: --port="10250" Apr 23 16:35:14.295889 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291393 2580 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 16:35:14.295889 ip-10-0-128-198 kubenswrapper[2580]: 
I0423 16:35:14.291396 2580 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0f8bae4daadfeab6b" Apr 23 16:35:14.295889 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291399 2580 flags.go:64] FLAG: --qos-reserved="" Apr 23 16:35:14.295889 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291402 2580 flags.go:64] FLAG: --read-only-port="10255" Apr 23 16:35:14.295889 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291405 2580 flags.go:64] FLAG: --register-node="true" Apr 23 16:35:14.295889 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291408 2580 flags.go:64] FLAG: --register-schedulable="true" Apr 23 16:35:14.295889 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291411 2580 flags.go:64] FLAG: --register-with-taints="" Apr 23 16:35:14.295889 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291414 2580 flags.go:64] FLAG: --registry-burst="10" Apr 23 16:35:14.295889 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291418 2580 flags.go:64] FLAG: --registry-qps="5" Apr 23 16:35:14.295889 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291421 2580 flags.go:64] FLAG: --reserved-cpus="" Apr 23 16:35:14.295889 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291424 2580 flags.go:64] FLAG: --reserved-memory="" Apr 23 16:35:14.295889 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291428 2580 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 16:35:14.295889 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291431 2580 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 16:35:14.295889 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291434 2580 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 16:35:14.295889 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291437 2580 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 16:35:14.295889 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291440 2580 flags.go:64] FLAG: --runonce="false" Apr 23 16:35:14.295889 ip-10-0-128-198 kubenswrapper[2580]: 
I0423 16:35:14.291443 2580 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 16:35:14.295889 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291446 2580 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 16:35:14.295889 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291449 2580 flags.go:64] FLAG: --seccomp-default="false" Apr 23 16:35:14.295889 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291452 2580 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 16:35:14.295889 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291455 2580 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 16:35:14.295889 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291458 2580 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 16:35:14.295889 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291461 2580 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 16:35:14.296524 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291464 2580 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 16:35:14.296524 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291467 2580 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 16:35:14.296524 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291470 2580 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 16:35:14.296524 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291473 2580 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 16:35:14.296524 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291476 2580 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 16:35:14.296524 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291479 2580 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 16:35:14.296524 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291482 2580 flags.go:64] FLAG: --system-cgroups="" Apr 23 16:35:14.296524 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291485 2580 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 16:35:14.296524 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291491 2580 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 16:35:14.296524 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291493 2580 flags.go:64] FLAG: --tls-cert-file="" Apr 23 16:35:14.296524 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291496 2580 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 16:35:14.296524 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291501 2580 flags.go:64] FLAG: --tls-min-version="" Apr 23 16:35:14.296524 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291503 2580 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 16:35:14.296524 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291506 2580 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 16:35:14.296524 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291509 2580 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 16:35:14.296524 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291512 2580 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 16:35:14.296524 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291515 2580 flags.go:64] FLAG: --v="2" Apr 23 16:35:14.296524 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291521 2580 flags.go:64] FLAG: --version="false" Apr 23 16:35:14.296524 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291525 2580 flags.go:64] FLAG: --vmodule="" Apr 23 16:35:14.296524 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291530 2580 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 16:35:14.296524 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.291533 2580 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 16:35:14.296524 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291626 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 16:35:14.296524 
ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291631 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 16:35:14.296524 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291634 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 16:35:14.297103 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291637 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 16:35:14.297103 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291640 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 16:35:14.297103 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291643 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 16:35:14.297103 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291645 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 16:35:14.297103 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291648 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 16:35:14.297103 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291652 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 16:35:14.297103 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291654 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 16:35:14.297103 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291657 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 16:35:14.297103 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291659 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 16:35:14.297103 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291662 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 16:35:14.297103 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291665 
2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 16:35:14.297103 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291668 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 16:35:14.297103 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291671 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 16:35:14.297103 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291674 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 16:35:14.297103 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291677 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 16:35:14.297103 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291680 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 16:35:14.297103 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291683 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 16:35:14.297103 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291685 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 16:35:14.297103 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291688 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 16:35:14.297103 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291690 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 16:35:14.297625 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291693 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 16:35:14.297625 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291695 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 16:35:14.297625 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291698 2580 feature_gate.go:328] unrecognized feature gate: 
ClusterVersionOperatorConfiguration Apr 23 16:35:14.297625 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291701 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 16:35:14.297625 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291704 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 16:35:14.297625 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291708 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 16:35:14.297625 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291710 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 16:35:14.297625 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291713 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 16:35:14.297625 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291715 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 16:35:14.297625 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291718 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 16:35:14.297625 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291720 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 16:35:14.297625 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291723 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 16:35:14.297625 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291725 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 16:35:14.297625 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291728 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 16:35:14.297625 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291730 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 16:35:14.297625 
ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291733 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 16:35:14.297625 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291735 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 16:35:14.297625 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291739 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 16:35:14.297625 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291742 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 16:35:14.298153 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291745 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 16:35:14.298153 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291747 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 16:35:14.298153 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291750 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 16:35:14.298153 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291753 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 16:35:14.298153 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291756 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 16:35:14.298153 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291758 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 16:35:14.298153 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291760 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 16:35:14.298153 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291764 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 23 16:35:14.298153 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291766 2580 feature_gate.go:328] unrecognized 
feature gate: EtcdBackendQuota Apr 23 16:35:14.298153 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291769 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 16:35:14.298153 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291771 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 16:35:14.298153 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291774 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 16:35:14.298153 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291776 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 16:35:14.298153 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291779 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 16:35:14.298153 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291782 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 16:35:14.298153 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291784 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 16:35:14.298153 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291787 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 16:35:14.298153 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291789 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 16:35:14.298153 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291793 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 16:35:14.298153 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291795 2580 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 16:35:14.298675 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291798 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 16:35:14.298675 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291800 2580 
feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 16:35:14.298675 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291803 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 16:35:14.298675 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291806 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 16:35:14.298675 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291808 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 16:35:14.298675 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291811 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 16:35:14.298675 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291813 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 16:35:14.298675 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291815 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 16:35:14.298675 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291818 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 16:35:14.298675 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291821 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 16:35:14.298675 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291824 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 16:35:14.298675 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291827 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 16:35:14.298675 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291830 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 16:35:14.298675 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291832 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 16:35:14.298675 ip-10-0-128-198 kubenswrapper[2580]: W0423 
16:35:14.291835 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 16:35:14.298675 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291838 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 16:35:14.298675 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291840 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 16:35:14.298675 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291842 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 16:35:14.298675 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291845 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 16:35:14.298675 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291847 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 16:35:14.299199 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291852 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 16:35:14.299199 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291855 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:35:14.299199 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291858 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:35:14.299199 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.291861 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:35:14.299199 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.292793 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 16:35:14.299199 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.299165 2580 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 23 16:35:14.299199 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.299182 2580 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 16:35:14.299406 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299228 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:35:14.299406 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299234 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:35:14.299406 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299237 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:35:14.299406 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299241 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:35:14.299406 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299243 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:35:14.299406 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299246 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:35:14.299406 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299249 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:35:14.299406 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299252 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:35:14.299406 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299255 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:35:14.299406 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299257 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:35:14.299406 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299260 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:35:14.299406 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299262 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:35:14.299406 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299265 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:35:14.299406 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299268 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:35:14.299406 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299271 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:35:14.299406 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299274 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:35:14.299406 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299276 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:35:14.299406 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299279 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:35:14.299406 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299281 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:35:14.299406 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299284 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:35:14.299892 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299287 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:35:14.299892 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299308 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:35:14.299892 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299311 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:35:14.299892 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299313 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:35:14.299892 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299316 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:35:14.299892 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299319 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:35:14.299892 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299321 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:35:14.299892 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299324 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:35:14.299892 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299326 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:35:14.299892 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299329 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:35:14.299892 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299332 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:35:14.299892 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299335 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:35:14.299892 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299338 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:35:14.299892 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299343 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:35:14.299892 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299348 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:35:14.299892 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299351 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:35:14.299892 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299354 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:35:14.299892 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299357 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:35:14.299892 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299360 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:35:14.300379 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299363 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:35:14.300379 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299365 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:35:14.300379 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299368 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:35:14.300379 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299371 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:35:14.300379 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299374 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:35:14.300379 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299376 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:35:14.300379 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299379 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:35:14.300379 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299382 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:35:14.300379 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299385 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:35:14.300379 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299387 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:35:14.300379 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299390 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:35:14.300379 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299392 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:35:14.300379 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299395 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:35:14.300379 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299397 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:35:14.300379 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299399 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:35:14.300379 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299402 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:35:14.300379 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299404 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:35:14.300379 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299407 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:35:14.300379 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299410 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:35:14.300841 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299412 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:35:14.300841 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299415 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:35:14.300841 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299418 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:35:14.300841 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299420 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:35:14.300841 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299423 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:35:14.300841 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299425 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:35:14.300841 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299429 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:35:14.300841 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299433 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:35:14.300841 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299436 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:35:14.300841 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299439 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:35:14.300841 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299441 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:35:14.300841 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299443 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:35:14.300841 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299446 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:35:14.300841 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299449 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:35:14.300841 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299451 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:35:14.300841 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299454 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:35:14.300841 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299457 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:35:14.300841 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299461 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:35:14.300841 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299464 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:35:14.300841 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299467 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:35:14.301339 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299470 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:35:14.301339 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299473 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:35:14.301339 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299475 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:35:14.301339 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299478 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:35:14.301339 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299480 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:35:14.301339 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299483 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:35:14.301339 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299485 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:35:14.301339 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299488 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:35:14.301339 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.299493 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 16:35:14.301339 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299591 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:35:14.301339 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299596 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:35:14.301339 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299599 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:35:14.301339 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299602 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:35:14.301339 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299605 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:35:14.301339 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299608 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:35:14.301339 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299611 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:35:14.301734 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299613 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:35:14.301734 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299616 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:35:14.301734 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299618 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:35:14.301734 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299625 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:35:14.301734 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299628 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:35:14.301734 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299631 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:35:14.301734 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299633 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:35:14.301734 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299635 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:35:14.301734 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299638 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:35:14.301734 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299640 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:35:14.301734 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299643 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:35:14.301734 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299645 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:35:14.301734 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299648 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:35:14.301734 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299651 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:35:14.301734 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299653 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:35:14.301734 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299656 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:35:14.301734 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299658 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:35:14.301734 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299661 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:35:14.301734 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299663 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:35:14.301734 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299666 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:35:14.302221 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299669 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:35:14.302221 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299671 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:35:14.302221 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299674 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:35:14.302221 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299676 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:35:14.302221 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299679 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:35:14.302221 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299681 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:35:14.302221 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299684 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:35:14.302221 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299687 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:35:14.302221 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299690 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:35:14.302221 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299694 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:35:14.302221 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299698 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:35:14.302221 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299701 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:35:14.302221 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299706 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:35:14.302221 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299709 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:35:14.302221 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299712 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:35:14.302221 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299714 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:35:14.302221 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299717 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:35:14.302221 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299720 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:35:14.302221 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299722 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:35:14.302695 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299725 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:35:14.302695 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299727 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:35:14.302695 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299729 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:35:14.302695 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299732 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:35:14.302695 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299734 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:35:14.302695 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299737 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:35:14.302695 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299739 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:35:14.302695 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299742 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:35:14.302695 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299744 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:35:14.302695 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299747 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:35:14.302695 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299749 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:35:14.302695 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299752 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:35:14.302695 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299754 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:35:14.302695 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299756 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:35:14.302695 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299759 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:35:14.302695 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299761 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:35:14.302695 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299764 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:35:14.302695 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299766 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:35:14.302695 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299768 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:35:14.302695 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299771 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:35:14.303185 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299774 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:35:14.303185 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299776 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:35:14.303185 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299779 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:35:14.303185 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299781 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:35:14.303185 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299784 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:35:14.303185 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299787 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:35:14.303185 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299789 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:35:14.303185 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299792 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:35:14.303185 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299794 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:35:14.303185 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299796 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:35:14.303185 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299799 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:35:14.303185 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299802 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:35:14.303185 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299804 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:35:14.303185 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299807 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:35:14.303185 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299809 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:35:14.303185 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299812 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:35:14.303185 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299814 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:35:14.303185 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299817 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:35:14.303185 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299819 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:35:14.303185 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:14.299822 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:35:14.303757 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.299827 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 16:35:14.303757 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.300574 2580 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 16:35:14.304770 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.304756 2580 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 16:35:14.305629 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.305619 2580 server.go:1019] "Starting client certificate rotation"
Apr 23 16:35:14.305731 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.305715 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 16:35:14.305767 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.305754 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 16:35:14.332617 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.332600 2580 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 16:35:14.340570 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.340546 2580 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 16:35:14.357671 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.357651 2580 log.go:25] "Validated CRI v1 runtime API"
Apr 23 16:35:14.364344 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.364326 2580 log.go:25] "Validated CRI v1 image API"
Apr 23 16:35:14.364810 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.364793 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 16:35:14.365619 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.365607 2580 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 16:35:14.369609 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.369584 2580 fs.go:135] Filesystem UUIDs: map[3a850c82-93c3-4d56-bf17-98ce67ce792e:/dev/nvme0n1p3 502208fd-40f7-4aee-9308-05066c934c53:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 23 16:35:14.369668 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.369610 2580 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 16:35:14.375265 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.375151 2580 manager.go:217] Machine: {Timestamp:2026-04-23 16:35:14.373417975 +0000 UTC m=+0.434492219 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3092771 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2764fe497022c133145b2c80fc7aa1 SystemUUID:ec2764fe-4970-22c1-3314-5b2c80fc7aa1 BootID:3e4111b0-c679-4f43-82b1-7c01a3359c53 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:a0:a2:77:a3:a3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:a0:a2:77:a3:a3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:5e:8c:e0:aa:dd:25 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 16:35:14.375265 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.375260 2580 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 16:35:14.375409 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.375397 2580 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 16:35:14.377604 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.377580 2580 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 16:35:14.377752 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.377607 2580 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-198.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessTh
an","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 23 16:35:14.377796 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.377761 2580 topology_manager.go:138] "Creating topology manager with none policy" Apr 23 16:35:14.377796 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.377770 2580 container_manager_linux.go:306] "Creating device plugin manager" Apr 23 16:35:14.377796 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.377783 2580 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 16:35:14.378452 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.378441 2580 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 16:35:14.379703 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.379693 2580 state_mem.go:36] "Initialized new in-memory state store" Apr 23 16:35:14.379807 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.379799 2580 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 23 16:35:14.382299 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.382282 2580 kubelet.go:491] "Attempting to sync node with API server" Apr 23 16:35:14.382368 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.382360 2580 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 23 16:35:14.382412 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.382375 2580 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 23 16:35:14.382412 ip-10-0-128-198 kubenswrapper[2580]: I0423 
16:35:14.382384 2580 kubelet.go:397] "Adding apiserver pod source" Apr 23 16:35:14.382412 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.382393 2580 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 23 16:35:14.383476 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.383464 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 16:35:14.383513 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.383482 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 16:35:14.386106 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.386090 2580 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 23 16:35:14.387956 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.387943 2580 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 23 16:35:14.389430 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.389419 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 23 16:35:14.389476 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.389436 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 23 16:35:14.389476 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.389442 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 23 16:35:14.389476 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.389451 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 23 16:35:14.389476 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.389460 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 23 16:35:14.389476 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.389466 2580 plugins.go:616] "Loaded volume 
plugin" pluginName="kubernetes.io/secret" Apr 23 16:35:14.389476 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.389472 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 23 16:35:14.389476 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.389477 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 16:35:14.389648 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.389484 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 16:35:14.389648 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.389490 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 23 16:35:14.389648 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.389498 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 16:35:14.389648 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.389507 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 16:35:14.390178 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.390168 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 23 16:35:14.390208 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.390180 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 23 16:35:14.394563 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.394547 2580 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 23 16:35:14.394635 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.394589 2580 server.go:1295] "Started kubelet" Apr 23 16:35:14.394747 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.394706 2580 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 23 16:35:14.394985 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.394940 2580 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 23 16:35:14.395029 ip-10-0-128-198 kubenswrapper[2580]: I0423 
16:35:14.395005 2580 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 23 16:35:14.395534 ip-10-0-128-198 systemd[1]: Started Kubernetes Kubelet. Apr 23 16:35:14.399638 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.399504 2580 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 23 16:35:14.399781 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:14.399638 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-198.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 23 16:35:14.399781 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.399709 2580 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-198.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 16:35:14.399934 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:14.399833 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 23 16:35:14.400756 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.400738 2580 server.go:317] "Adding debug handlers to kubelet server" Apr 23 16:35:14.402998 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:14.402247 2580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-198.ec2.internal.18a909a0d7d0991c default 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-198.ec2.internal,UID:ip-10-0-128-198.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-128-198.ec2.internal,},FirstTimestamp:2026-04-23 16:35:14.394560796 +0000 UTC m=+0.455635043,LastTimestamp:2026-04-23 16:35:14.394560796 +0000 UTC m=+0.455635043,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-198.ec2.internal,}" Apr 23 16:35:14.405159 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.405144 2580 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 23 16:35:14.405235 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.405157 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 23 16:35:14.405916 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.405894 2580 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 23 16:35:14.405916 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.405897 2580 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 23 16:35:14.406030 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.405931 2580 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 23 16:35:14.406082 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.406050 2580 reconstruct.go:97] "Volume reconstruction finished" Apr 23 16:35:14.406082 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.406062 2580 reconciler.go:26] "Reconciler: start to sync state" Apr 23 16:35:14.406178 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:14.406134 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-198.ec2.internal\" not found" Apr 23 16:35:14.406448 ip-10-0-128-198 kubenswrapper[2580]: I0423 
16:35:14.406424 2580 factory.go:55] Registering systemd factory Apr 23 16:35:14.406448 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.406448 2580 factory.go:223] Registration of the systemd container factory successfully Apr 23 16:35:14.406690 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.406675 2580 factory.go:153] Registering CRI-O factory Apr 23 16:35:14.406690 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.406690 2580 factory.go:223] Registration of the crio container factory successfully Apr 23 16:35:14.406827 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.406777 2580 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 23 16:35:14.406827 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.406802 2580 factory.go:103] Registering Raw factory Apr 23 16:35:14.406827 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.406819 2580 manager.go:1196] Started watching for new ooms in manager Apr 23 16:35:14.407182 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.407167 2580 manager.go:319] Starting recovery of all containers Apr 23 16:35:14.407344 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:14.407317 2580 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 23 16:35:14.416149 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:14.416126 2580 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-128-198.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 23 16:35:14.416365 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.416351 2580 manager.go:324] Recovery completed Apr 23 16:35:14.416487 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:14.416467 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 23 16:35:14.420362 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.420349 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:35:14.422647 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.422631 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-198.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:35:14.422716 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.422661 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-198.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:35:14.422716 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.422672 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-198.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:35:14.423166 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.423150 2580 cpu_manager.go:222] "Starting CPU manager" 
policy="none" Apr 23 16:35:14.423166 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.423165 2580 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 23 16:35:14.423249 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.423180 2580 state_mem.go:36] "Initialized new in-memory state store" Apr 23 16:35:14.424367 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:14.424308 2580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-198.ec2.internal.18a909a0d97d3019 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-198.ec2.internal,UID:ip-10-0-128-198.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-128-198.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-128-198.ec2.internal,},FirstTimestamp:2026-04-23 16:35:14.422648857 +0000 UTC m=+0.483723101,LastTimestamp:2026-04-23 16:35:14.422648857 +0000 UTC m=+0.483723101,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-198.ec2.internal,}" Apr 23 16:35:14.425441 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.425427 2580 policy_none.go:49] "None policy: Start" Apr 23 16:35:14.425493 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.425446 2580 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 23 16:35:14.425493 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.425459 2580 state_mem.go:35] "Initializing new in-memory state store" Apr 23 16:35:14.434022 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:14.433961 2580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-198.ec2.internal.18a909a0d97d7441 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-198.ec2.internal,UID:ip-10-0-128-198.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-128-198.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-128-198.ec2.internal,},FirstTimestamp:2026-04-23 16:35:14.422666305 +0000 UTC m=+0.483740550,LastTimestamp:2026-04-23 16:35:14.422666305 +0000 UTC m=+0.483740550,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-198.ec2.internal,}" Apr 23 16:35:14.436477 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.436461 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-w48mw" Apr 23 16:35:14.442501 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.442484 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-w48mw" Apr 23 16:35:14.444238 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:14.444177 2580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-198.ec2.internal.18a909a0d97d994a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-198.ec2.internal,UID:ip-10-0-128-198.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-128-198.ec2.internal status is now: 
NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-128-198.ec2.internal,},FirstTimestamp:2026-04-23 16:35:14.422675786 +0000 UTC m=+0.483750034,LastTimestamp:2026-04-23 16:35:14.422675786 +0000 UTC m=+0.483750034,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-198.ec2.internal,}" Apr 23 16:35:14.473751 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.463867 2580 manager.go:341] "Starting Device Plugin manager" Apr 23 16:35:14.473751 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:14.463943 2580 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 23 16:35:14.473751 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.463957 2580 server.go:85] "Starting device plugin registration server" Apr 23 16:35:14.473751 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.464172 2580 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 23 16:35:14.473751 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.464182 2580 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 23 16:35:14.473751 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.464264 2580 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 23 16:35:14.473751 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.464354 2580 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 23 16:35:14.473751 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.464361 2580 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 23 16:35:14.473751 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:14.464839 2580 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 23 16:35:14.473751 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:14.464881 2580 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-198.ec2.internal\" not found" Apr 23 16:35:14.530901 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.530865 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 23 16:35:14.532018 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.532004 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 23 16:35:14.532096 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.532029 2580 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 23 16:35:14.532096 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.532049 2580 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 23 16:35:14.532096 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.532055 2580 kubelet.go:2451] "Starting kubelet main sync loop" Apr 23 16:35:14.532096 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:14.532086 2580 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 23 16:35:14.535074 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.535048 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 16:35:14.564746 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.564732 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:35:14.568101 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.568086 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-198.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:35:14.568174 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.568113 2580 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-198.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:35:14.568174 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.568123 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-198.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:35:14.568174 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.568145 2580 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-198.ec2.internal" Apr 23 16:35:14.578983 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.578967 2580 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-198.ec2.internal" Apr 23 16:35:14.579031 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:14.578991 2580 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-198.ec2.internal\": node \"ip-10-0-128-198.ec2.internal\" not found" Apr 23 16:35:14.592866 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:14.592847 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-198.ec2.internal\" not found" Apr 23 16:35:14.632913 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.632889 2580 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-198.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-198.ec2.internal"] Apr 23 16:35:14.632989 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.632973 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:35:14.633854 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.633832 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-198.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:35:14.633931 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.633871 2580 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-128-198.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:35:14.633931 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.633888 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-198.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:35:14.635644 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.635632 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:35:14.635781 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.635766 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-198.ec2.internal" Apr 23 16:35:14.635816 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.635801 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:35:14.636306 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.636274 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-198.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:35:14.636396 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.636312 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-198.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:35:14.636396 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.636322 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-198.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:35:14.636396 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.636275 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-198.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:35:14.636396 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.636352 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-198.ec2.internal" 
event="NodeHasNoDiskPressure" Apr 23 16:35:14.636396 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.636364 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-198.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:35:14.637542 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.637525 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-198.ec2.internal" Apr 23 16:35:14.637588 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.637558 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:35:14.638181 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.638162 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-198.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:35:14.638264 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.638185 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-198.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:35:14.638264 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.638198 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-198.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:35:14.663839 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:14.663821 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-198.ec2.internal\" not found" node="ip-10-0-128-198.ec2.internal" Apr 23 16:35:14.668145 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:14.668131 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-198.ec2.internal\" not found" node="ip-10-0-128-198.ec2.internal" Apr 23 16:35:14.693042 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:14.693023 2580 kubelet_node_status.go:515] "Error getting the current node 
from lister" err="node \"ip-10-0-128-198.ec2.internal\" not found"
Apr 23 16:35:14.708034 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.708013 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fd0ef97a99ad7f8bb0953574fa7327d0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-198.ec2.internal\" (UID: \"fd0ef97a99ad7f8bb0953574fa7327d0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-198.ec2.internal"
Apr 23 16:35:14.708091 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.708039 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd0ef97a99ad7f8bb0953574fa7327d0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-198.ec2.internal\" (UID: \"fd0ef97a99ad7f8bb0953574fa7327d0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-198.ec2.internal"
Apr 23 16:35:14.708091 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.708057 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49434315036770a524de2f8664b84004-config\") pod \"kube-apiserver-proxy-ip-10-0-128-198.ec2.internal\" (UID: \"49434315036770a524de2f8664b84004\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-198.ec2.internal"
Apr 23 16:35:14.793285 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:14.793204 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-198.ec2.internal\" not found"
Apr 23 16:35:14.808612 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.808586 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fd0ef97a99ad7f8bb0953574fa7327d0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-198.ec2.internal\" (UID: \"fd0ef97a99ad7f8bb0953574fa7327d0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-198.ec2.internal"
Apr 23 16:35:14.808692 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.808617 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd0ef97a99ad7f8bb0953574fa7327d0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-198.ec2.internal\" (UID: \"fd0ef97a99ad7f8bb0953574fa7327d0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-198.ec2.internal"
Apr 23 16:35:14.808692 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.808634 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49434315036770a524de2f8664b84004-config\") pod \"kube-apiserver-proxy-ip-10-0-128-198.ec2.internal\" (UID: \"49434315036770a524de2f8664b84004\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-198.ec2.internal"
Apr 23 16:35:14.808692 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.808686 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fd0ef97a99ad7f8bb0953574fa7327d0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-198.ec2.internal\" (UID: \"fd0ef97a99ad7f8bb0953574fa7327d0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-198.ec2.internal"
Apr 23 16:35:14.808796 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.808753 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49434315036770a524de2f8664b84004-config\") pod \"kube-apiserver-proxy-ip-10-0-128-198.ec2.internal\" (UID: \"49434315036770a524de2f8664b84004\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-198.ec2.internal"
Apr 23 16:35:14.808796 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.808792 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd0ef97a99ad7f8bb0953574fa7327d0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-198.ec2.internal\" (UID: \"fd0ef97a99ad7f8bb0953574fa7327d0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-198.ec2.internal"
Apr 23 16:35:14.893976 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:14.893931 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-198.ec2.internal\" not found"
Apr 23 16:35:14.966392 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.966366 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-198.ec2.internal"
Apr 23 16:35:14.969986 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:14.969966 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-198.ec2.internal"
Apr 23 16:35:14.994850 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:14.994826 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-198.ec2.internal\" not found"
Apr 23 16:35:15.095429 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:15.095336 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-198.ec2.internal\" not found"
Apr 23 16:35:15.195919 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:15.195886 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-198.ec2.internal\" not found"
Apr 23 16:35:15.296496 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:15.296464 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-198.ec2.internal\" not found"
Apr 23 16:35:15.305662 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:15.305641 2580 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 16:35:15.305783 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:15.305769 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 16:35:15.397457 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:15.397397 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-198.ec2.internal\" not found"
Apr 23 16:35:15.405972 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:15.405952 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 16:35:15.435350 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:15.435329 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 16:35:15.444934 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:15.444900 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 16:30:14 +0000 UTC" deadline="2027-10-18 07:30:14.222318619 +0000 UTC"
Apr 23 16:35:15.444934 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:15.444933 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13022h54m58.7773899s"
Apr 23 16:35:15.498518 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:15.498476 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-198.ec2.internal\" not found"
Apr 23 16:35:15.507448 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:15.507427 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-vqns7"
Apr 23 16:35:15.520565 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:15.520534 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd0ef97a99ad7f8bb0953574fa7327d0.slice/crio-920c477a8e94768eacc4c6c5e4e34696472b90c6b3b1a63ee6e90f4ba00e49ce WatchSource:0}: Error finding container 920c477a8e94768eacc4c6c5e4e34696472b90c6b3b1a63ee6e90f4ba00e49ce: Status 404 returned error can't find the container with id 920c477a8e94768eacc4c6c5e4e34696472b90c6b3b1a63ee6e90f4ba00e49ce
Apr 23 16:35:15.520760 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:15.520741 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49434315036770a524de2f8664b84004.slice/crio-49aa21798918bef98dfb9229d73c1571ccb9886bf70f66e40b32896b15682916 WatchSource:0}: Error finding container 49aa21798918bef98dfb9229d73c1571ccb9886bf70f66e40b32896b15682916: Status 404 returned error can't find the container with id 49aa21798918bef98dfb9229d73c1571ccb9886bf70f66e40b32896b15682916
Apr 23 16:35:15.524463 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:15.524438 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-vqns7"
Apr 23 16:35:15.526653 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:15.526641 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 16:35:15.535130 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:15.535091 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-198.ec2.internal" event={"ID":"fd0ef97a99ad7f8bb0953574fa7327d0","Type":"ContainerStarted","Data":"920c477a8e94768eacc4c6c5e4e34696472b90c6b3b1a63ee6e90f4ba00e49ce"}
Apr 23 16:35:15.536066 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:15.536047 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-198.ec2.internal" event={"ID":"49434315036770a524de2f8664b84004","Type":"ContainerStarted","Data":"49aa21798918bef98dfb9229d73c1571ccb9886bf70f66e40b32896b15682916"}
Apr 23 16:35:15.573088 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:15.573064 2580 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:35:15.599347 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:15.599326 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-198.ec2.internal\" not found"
Apr 23 16:35:15.689635 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:15.689557 2580 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:35:15.706322 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:15.706284 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-198.ec2.internal"
Apr 23 16:35:15.730027 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:15.730004 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 16:35:15.731629 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:15.731612 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-198.ec2.internal"
Apr 23 16:35:15.733464 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:15.733450 2580 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:35:15.745908 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:15.745888 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 16:35:16.383560 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.383522 2580 apiserver.go:52] "Watching apiserver"
Apr 23 16:35:16.394067 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.394038 2580 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 23 16:35:16.394439 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.394414 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g68pm","openshift-dns/node-resolver-w5pdt","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-198.ec2.internal","openshift-multus/network-metrics-daemon-5ps7g","openshift-network-operator/iptables-alerter-rflcs","kube-system/global-pull-secret-syncer-tr482","kube-system/konnectivity-agent-q8rd7","openshift-cluster-node-tuning-operator/tuned-9qvrt","openshift-image-registry/node-ca-z94bz","openshift-multus/multus-additional-cni-plugins-wqfht","openshift-multus/multus-xs2mg","openshift-network-diagnostics/network-check-target-kh9hh","openshift-ovn-kubernetes/ovnkube-node-nfkqz","kube-system/kube-apiserver-proxy-ip-10-0-128-198.ec2.internal"]
Apr 23 16:35:16.395860 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.395836 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-q8rd7"
Apr 23 16:35:16.397065 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.397040 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz"
Apr 23 16:35:16.398172 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.398144 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9qvrt"
Apr 23 16:35:16.398558 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.398537 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 23 16:35:16.398828 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.398671 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 23 16:35:16.398828 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.398764 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-7vv6n\""
Apr 23 16:35:16.399453 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.399434 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 23 16:35:16.399559 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.399543 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5ps7g"
Apr 23 16:35:16.399654 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:16.399631 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5ps7g" podUID="4033b659-eaae-4ad3-a8a3-523bdf5fcf89"
Apr 23 16:35:16.400669 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.400649 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 23 16:35:16.400669 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.400661 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rflcs"
Apr 23 16:35:16.401013 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.400995 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 23 16:35:16.401663 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.401646 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-t6kth\""
Apr 23 16:35:16.402194 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.401881 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 23 16:35:16.402194 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.401986 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 23 16:35:16.402194 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.402020 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-zqfld\""
Apr 23 16:35:16.402194 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.402145 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 23 16:35:16.402561 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.402349 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 23 16:35:16.402561 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.402403 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 23 16:35:16.403358 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.403127 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 23 16:35:16.403358 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.403139 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 23 16:35:16.403358 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.403221 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-9kgt9\""
Apr 23 16:35:16.403358 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.403346 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh9hh"
Apr 23 16:35:16.403600 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:16.403399 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kh9hh" podUID="3c0ebdcb-a7e8-4e29-a486-52aed308cf33"
Apr 23 16:35:16.403600 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.403412 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 23 16:35:16.404658 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.404640 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-z94bz"
Apr 23 16:35:16.404749 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.404723 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g68pm"
Apr 23 16:35:16.406317 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.406049 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w5pdt"
Apr 23 16:35:16.407765 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.407736 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 23 16:35:16.407765 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.407749 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 23 16:35:16.407947 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.407913 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 23 16:35:16.408032 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.407974 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 23 16:35:16.408093 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.408053 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 23 16:35:16.408093 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.407976 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-2tgsr\""
Apr 23 16:35:16.408184 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.407921 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-bjgcj\""
Apr 23 16:35:16.409089 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.408775 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-zwkbr\""
Apr 23 16:35:16.409089 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.408797 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 23 16:35:16.409267 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.409183 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 23 16:35:16.409267 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.409211 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wqfht"
Apr 23 16:35:16.409267 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.409225 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 23 16:35:16.410608 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.410588 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tr482"
Apr 23 16:35:16.410690 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.410635 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xs2mg"
Apr 23 16:35:16.411528 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:16.410869 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tr482" podUID="047666b9-5e8b-4117-8317-ca917bf89757"
Apr 23 16:35:16.411829 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.411806 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 23 16:35:16.411928 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.411833 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 23 16:35:16.412110 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.412092 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 23 16:35:16.412190 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.412171 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 23 16:35:16.412609 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.412592 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-l87hb\""
Apr 23 16:35:16.413320 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.413285 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 23 16:35:16.413510 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.413491 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 23 16:35:16.413575 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.413556 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-mz2r4\""
Apr 23 16:35:16.416425 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.416397 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-var-lib-openvswitch\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz"
Apr 23 16:35:16.416576 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.416432 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-var-lib-kubelet\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt"
Apr 23 16:35:16.416576 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.416457 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/268f3349-6678-43d5-8596-698c807f908a-agent-certs\") pod \"konnectivity-agent-q8rd7\" (UID: \"268f3349-6678-43d5-8596-698c807f908a\") " pod="kube-system/konnectivity-agent-q8rd7"
Apr 23 16:35:16.416576 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.416481 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-metrics-certs\") pod \"network-metrics-daemon-5ps7g\" (UID: \"4033b659-eaae-4ad3-a8a3-523bdf5fcf89\") " pod="openshift-multus/network-metrics-daemon-5ps7g"
Apr 23 16:35:16.416576 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.416530 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv7gb\" (UniqueName: \"kubernetes.io/projected/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-kube-api-access-nv7gb\") pod \"network-metrics-daemon-5ps7g\" (UID: \"4033b659-eaae-4ad3-a8a3-523bdf5fcf89\") " pod="openshift-multus/network-metrics-daemon-5ps7g"
Apr 23 16:35:16.416823 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.416588 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b153d070-3e87-4322-a47e-5cefb6aded60-registration-dir\") pod \"aws-ebs-csi-driver-node-g68pm\" (UID: \"b153d070-3e87-4322-a47e-5cefb6aded60\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g68pm"
Apr 23 16:35:16.416823 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.416629 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ee57872d-b83c-49ea-b226-5322cb6d1db3-system-cni-dir\") pod \"multus-additional-cni-plugins-wqfht\" (UID: \"ee57872d-b83c-49ea-b226-5322cb6d1db3\") " pod="openshift-multus/multus-additional-cni-plugins-wqfht"
Apr 23 16:35:16.416823 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.416658 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5949893b-cd3d-46d5-b194-4ef1ad542b81-ovnkube-config\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz"
Apr 23 16:35:16.416823 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.416682 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jhh9\" (UniqueName: \"kubernetes.io/projected/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-kube-api-access-7jhh9\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt"
Apr 23 16:35:16.416823 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.416708 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv9fl\" (UniqueName: \"kubernetes.io/projected/ee57872d-b83c-49ea-b226-5322cb6d1db3-kube-api-access-gv9fl\") pod \"multus-additional-cni-plugins-wqfht\" (UID: \"ee57872d-b83c-49ea-b226-5322cb6d1db3\") " pod="openshift-multus/multus-additional-cni-plugins-wqfht"
Apr 23 16:35:16.416823 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.416733 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wpfp\" (UniqueName: \"kubernetes.io/projected/b153d070-3e87-4322-a47e-5cefb6aded60-kube-api-access-6wpfp\") pod \"aws-ebs-csi-driver-node-g68pm\" (UID: \"b153d070-3e87-4322-a47e-5cefb6aded60\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g68pm"
Apr 23 16:35:16.416823 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.416759 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz"
Apr 23 16:35:16.416823 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.416779 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-etc-sysconfig\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt"
Apr 23 16:35:16.416823 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.416793 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/398390dc-662b-42ee-b57e-22175922f0ac-host-slash\") pod \"iptables-alerter-rflcs\" (UID: \"398390dc-662b-42ee-b57e-22175922f0ac\") " pod="openshift-network-operator/iptables-alerter-rflcs"
Apr 23 16:35:16.416823 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.416816 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25hbm\" (UniqueName: \"kubernetes.io/projected/398390dc-662b-42ee-b57e-22175922f0ac-kube-api-access-25hbm\") pod \"iptables-alerter-rflcs\" (UID: \"398390dc-662b-42ee-b57e-22175922f0ac\") " pod="openshift-network-operator/iptables-alerter-rflcs"
Apr 23 16:35:16.417286 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.416839 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5949893b-cd3d-46d5-b194-4ef1ad542b81-env-overrides\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz"
Apr 23 16:35:16.417286 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.416862 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b153d070-3e87-4322-a47e-5cefb6aded60-etc-selinux\") pod \"aws-ebs-csi-driver-node-g68pm\" (UID: \"b153d070-3e87-4322-a47e-5cefb6aded60\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g68pm"
Apr 23 16:35:16.417286 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.416886 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-etc-tuned\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt"
Apr 23 16:35:16.417286 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.416918 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5949893b-cd3d-46d5-b194-4ef1ad542b81-ovnkube-script-lib\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz"
Apr 23 16:35:16.417286 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.416941 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e6b0d15-ef44-4cfa-b68b-1d1110a5e3b4-host\") pod \"node-ca-z94bz\" (UID: \"8e6b0d15-ef44-4cfa-b68b-1d1110a5e3b4\") " pod="openshift-image-registry/node-ca-z94bz"
Apr 23 16:35:16.417286 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.416966 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-etc-sysctl-conf\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt"
Apr 23 16:35:16.417286 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.417031 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-sys\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt"
Apr 23 16:35:16.417286 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.417074 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-lib-modules\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt"
Apr 23 16:35:16.417286 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.417103 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b153d070-3e87-4322-a47e-5cefb6aded60-device-dir\") pod \"aws-ebs-csi-driver-node-g68pm\" (UID: \"b153d070-3e87-4322-a47e-5cefb6aded60\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g68pm"
Apr 23 16:35:16.417286 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.417133 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b153d070-3e87-4322-a47e-5cefb6aded60-sys-fs\") pod \"aws-ebs-csi-driver-node-g68pm\" (UID: \"b153d070-3e87-4322-a47e-5cefb6aded60\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g68pm"
Apr 23 16:35:16.417286 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.417159 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgmfm\" (UniqueName: \"kubernetes.io/projected/8e6b0d15-ef44-4cfa-b68b-1d1110a5e3b4-kube-api-access-kgmfm\") pod \"node-ca-z94bz\" (UID: \"8e6b0d15-ef44-4cfa-b68b-1d1110a5e3b4\") " pod="openshift-image-registry/node-ca-z94bz"
Apr 23 16:35:16.417286 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.417214 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ee57872d-b83c-49ea-b226-5322cb6d1db3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wqfht\" (UID: \"ee57872d-b83c-49ea-b226-5322cb6d1db3\") " pod="openshift-multus/multus-additional-cni-plugins-wqfht"
Apr 23 16:35:16.417286 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.417249 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-etc-systemd\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt"
Apr 23 16:35:16.417286 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.417273 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ee57872d-b83c-49ea-b226-5322cb6d1db3-cnibin\") pod \"multus-additional-cni-plugins-wqfht\" (UID: \"ee57872d-b83c-49ea-b226-5322cb6d1db3\") " pod="openshift-multus/multus-additional-cni-plugins-wqfht"
Apr 23 16:35:16.417985 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.417315 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ee57872d-b83c-49ea-b226-5322cb6d1db3-os-release\") pod \"multus-additional-cni-plugins-wqfht\" (UID: \"ee57872d-b83c-49ea-b226-5322cb6d1db3\") " pod="openshift-multus/multus-additional-cni-plugins-wqfht"
Apr 23 16:35:16.417985 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.417349 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/268f3349-6678-43d5-8596-698c807f908a-konnectivity-ca\") pod \"konnectivity-agent-q8rd7\" (UID: \"268f3349-6678-43d5-8596-698c807f908a\") " pod="kube-system/konnectivity-agent-q8rd7"
Apr 23 16:35:16.417985 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.417402 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-systemd-units\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz"
Apr 23 16:35:16.417985 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.417455 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-host-kubelet\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz"
Apr 23 16:35:16.417985 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.417479 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-host-slash\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz"
Apr 23 16:35:16.417985 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.417502 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-run-systemd\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz"
Apr 23 16:35:16.417985 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.417772 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-etc-modprobe-d\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt"
Apr 23 16:35:16.417985 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.417801 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b153d070-3e87-4322-a47e-5cefb6aded60-socket-dir\") pod \"aws-ebs-csi-driver-node-g68pm\" (UID: \"b153d070-3e87-4322-a47e-5cefb6aded60\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g68pm"
Apr 23 16:35:16.417985 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.417824 2580
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8e6b0d15-ef44-4cfa-b68b-1d1110a5e3b4-serviceca\") pod \"node-ca-z94bz\" (UID: \"8e6b0d15-ef44-4cfa-b68b-1d1110a5e3b4\") " pod="openshift-image-registry/node-ca-z94bz" Apr 23 16:35:16.417985 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.417873 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5da0035a-7e6e-4e50-9404-1dde996e4313-hosts-file\") pod \"node-resolver-w5pdt\" (UID: \"5da0035a-7e6e-4e50-9404-1dde996e4313\") " pod="openshift-dns/node-resolver-w5pdt" Apr 23 16:35:16.417985 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.417901 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ee57872d-b83c-49ea-b226-5322cb6d1db3-cni-binary-copy\") pod \"multus-additional-cni-plugins-wqfht\" (UID: \"ee57872d-b83c-49ea-b226-5322cb6d1db3\") " pod="openshift-multus/multus-additional-cni-plugins-wqfht" Apr 23 16:35:16.417985 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.417915 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-run-ovn\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:16.417985 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.417935 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-node-log\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 
16:35:16.417985 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.417960 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-host-cni-netd\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:16.417985 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.417992 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-log-socket\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:16.418517 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.418019 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-run-openvswitch\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:16.418517 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.418034 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-run\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt" Apr 23 16:35:16.418517 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.418047 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-host\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " 
pod="openshift-cluster-node-tuning-operator/tuned-9qvrt" Apr 23 16:35:16.418517 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.418061 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5da0035a-7e6e-4e50-9404-1dde996e4313-tmp-dir\") pod \"node-resolver-w5pdt\" (UID: \"5da0035a-7e6e-4e50-9404-1dde996e4313\") " pod="openshift-dns/node-resolver-w5pdt" Apr 23 16:35:16.418517 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.418077 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ee57872d-b83c-49ea-b226-5322cb6d1db3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wqfht\" (UID: \"ee57872d-b83c-49ea-b226-5322cb6d1db3\") " pod="openshift-multus/multus-additional-cni-plugins-wqfht" Apr 23 16:35:16.418517 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.418091 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-host-run-ovn-kubernetes\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:16.418517 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.418105 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-host-cni-bin\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:16.418517 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.418119 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-etc-kubernetes\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt" Apr 23 16:35:16.418517 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.418135 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ee57872d-b83c-49ea-b226-5322cb6d1db3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wqfht\" (UID: \"ee57872d-b83c-49ea-b226-5322cb6d1db3\") " pod="openshift-multus/multus-additional-cni-plugins-wqfht" Apr 23 16:35:16.418517 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.418151 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-etc-openvswitch\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:16.418517 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.418165 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5949893b-cd3d-46d5-b194-4ef1ad542b81-ovn-node-metrics-cert\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:16.418517 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.418179 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-tmp\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt" Apr 23 16:35:16.418517 ip-10-0-128-198 
kubenswrapper[2580]: I0423 16:35:16.418193 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/398390dc-662b-42ee-b57e-22175922f0ac-iptables-alerter-script\") pod \"iptables-alerter-rflcs\" (UID: \"398390dc-662b-42ee-b57e-22175922f0ac\") " pod="openshift-network-operator/iptables-alerter-rflcs" Apr 23 16:35:16.418517 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.418207 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-host-run-netns\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:16.418517 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.418221 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpx5b\" (UniqueName: \"kubernetes.io/projected/5949893b-cd3d-46d5-b194-4ef1ad542b81-kube-api-access-dpx5b\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:16.418517 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.418235 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b153d070-3e87-4322-a47e-5cefb6aded60-kubelet-dir\") pod \"aws-ebs-csi-driver-node-g68pm\" (UID: \"b153d070-3e87-4322-a47e-5cefb6aded60\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g68pm" Apr 23 16:35:16.419063 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.418248 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-etc-sysctl-d\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt" Apr 23 16:35:16.419063 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.418262 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjms6\" (UniqueName: \"kubernetes.io/projected/3c0ebdcb-a7e8-4e29-a486-52aed308cf33-kube-api-access-pjms6\") pod \"network-check-target-kh9hh\" (UID: \"3c0ebdcb-a7e8-4e29-a486-52aed308cf33\") " pod="openshift-network-diagnostics/network-check-target-kh9hh" Apr 23 16:35:16.419063 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.418275 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4tch\" (UniqueName: \"kubernetes.io/projected/5da0035a-7e6e-4e50-9404-1dde996e4313-kube-api-access-v4tch\") pod \"node-resolver-w5pdt\" (UID: \"5da0035a-7e6e-4e50-9404-1dde996e4313\") " pod="openshift-dns/node-resolver-w5pdt" Apr 23 16:35:16.506871 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.506840 2580 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 23 16:35:16.518534 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.518509 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-systemd-units\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:16.518534 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.518538 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-host-kubelet\") pod \"ovnkube-node-nfkqz\" (UID: 
\"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:16.518766 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.518560 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-host-slash\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:16.518766 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.518579 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-run-systemd\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:16.518766 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.518617 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-etc-modprobe-d\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt" Apr 23 16:35:16.518766 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.518638 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-systemd-units\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:16.518766 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.518650 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-host-slash\") pod \"ovnkube-node-nfkqz\" (UID: 
\"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:16.518766 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.518654 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b153d070-3e87-4322-a47e-5cefb6aded60-socket-dir\") pod \"aws-ebs-csi-driver-node-g68pm\" (UID: \"b153d070-3e87-4322-a47e-5cefb6aded60\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g68pm" Apr 23 16:35:16.518766 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.518693 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-host-kubelet\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:16.518766 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.518632 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-run-systemd\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:16.518766 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.518705 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8e6b0d15-ef44-4cfa-b68b-1d1110a5e3b4-serviceca\") pod \"node-ca-z94bz\" (UID: \"8e6b0d15-ef44-4cfa-b68b-1d1110a5e3b4\") " pod="openshift-image-registry/node-ca-z94bz" Apr 23 16:35:16.518766 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.518764 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5da0035a-7e6e-4e50-9404-1dde996e4313-hosts-file\") pod \"node-resolver-w5pdt\" (UID: 
\"5da0035a-7e6e-4e50-9404-1dde996e4313\") " pod="openshift-dns/node-resolver-w5pdt" Apr 23 16:35:16.519241 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.518776 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b153d070-3e87-4322-a47e-5cefb6aded60-socket-dir\") pod \"aws-ebs-csi-driver-node-g68pm\" (UID: \"b153d070-3e87-4322-a47e-5cefb6aded60\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g68pm" Apr 23 16:35:16.519241 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.518782 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-etc-modprobe-d\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt" Apr 23 16:35:16.519241 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.518827 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ee57872d-b83c-49ea-b226-5322cb6d1db3-cni-binary-copy\") pod \"multus-additional-cni-plugins-wqfht\" (UID: \"ee57872d-b83c-49ea-b226-5322cb6d1db3\") " pod="openshift-multus/multus-additional-cni-plugins-wqfht" Apr 23 16:35:16.519241 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.518837 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5da0035a-7e6e-4e50-9404-1dde996e4313-hosts-file\") pod \"node-resolver-w5pdt\" (UID: \"5da0035a-7e6e-4e50-9404-1dde996e4313\") " pod="openshift-dns/node-resolver-w5pdt" Apr 23 16:35:16.519241 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.518866 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-run-ovn\") pod \"ovnkube-node-nfkqz\" (UID: 
\"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:16.519241 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.518920 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-run-ovn\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:16.519241 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.518968 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-node-log\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:16.519241 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.518996 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-host-cni-netd\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:16.519241 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519021 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-log-socket\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:16.519241 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519054 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-multus-socket-dir-parent\") pod \"multus-xs2mg\" 
(UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.519241 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519055 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-node-log\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:16.519241 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519084 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-host-cni-netd\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:16.519241 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519096 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b70dcd33-861c-4a46-8752-421c750db1ff-multus-daemon-config\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.519241 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519105 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-log-socket\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:16.519241 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519127 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-host-run-multus-certs\") pod \"multus-xs2mg\" (UID: 
\"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.519241 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519158 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-run-openvswitch\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:16.519241 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519215 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8e6b0d15-ef44-4cfa-b68b-1d1110a5e3b4-serviceca\") pod \"node-ca-z94bz\" (UID: \"8e6b0d15-ef44-4cfa-b68b-1d1110a5e3b4\") " pod="openshift-image-registry/node-ca-z94bz" Apr 23 16:35:16.519241 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519226 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/047666b9-5e8b-4117-8317-ca917bf89757-dbus\") pod \"global-pull-secret-syncer-tr482\" (UID: \"047666b9-5e8b-4117-8317-ca917bf89757\") " pod="kube-system/global-pull-secret-syncer-tr482" Apr 23 16:35:16.520037 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519278 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-run-openvswitch\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:16.520037 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519281 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-run\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " 
pod="openshift-cluster-node-tuning-operator/tuned-9qvrt" Apr 23 16:35:16.520037 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519325 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-host\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt" Apr 23 16:35:16.520037 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519341 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5da0035a-7e6e-4e50-9404-1dde996e4313-tmp-dir\") pod \"node-resolver-w5pdt\" (UID: \"5da0035a-7e6e-4e50-9404-1dde996e4313\") " pod="openshift-dns/node-resolver-w5pdt" Apr 23 16:35:16.520037 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519358 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ee57872d-b83c-49ea-b226-5322cb6d1db3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wqfht\" (UID: \"ee57872d-b83c-49ea-b226-5322cb6d1db3\") " pod="openshift-multus/multus-additional-cni-plugins-wqfht" Apr 23 16:35:16.520037 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519366 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-run\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt" Apr 23 16:35:16.520037 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519381 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-host-run-ovn-kubernetes\") pod \"ovnkube-node-nfkqz\" (UID: 
\"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz"
Apr 23 16:35:16.520037 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519403 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ee57872d-b83c-49ea-b226-5322cb6d1db3-cni-binary-copy\") pod \"multus-additional-cni-plugins-wqfht\" (UID: \"ee57872d-b83c-49ea-b226-5322cb6d1db3\") " pod="openshift-multus/multus-additional-cni-plugins-wqfht"
Apr 23 16:35:16.520037 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519411 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-host\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt"
Apr 23 16:35:16.520037 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519406 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-host-cni-bin\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz"
Apr 23 16:35:16.520037 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519447 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-host-run-k8s-cni-cncf-io\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg"
Apr 23 16:35:16.520037 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519455 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-host-run-ovn-kubernetes\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz"
Apr 23 16:35:16.520037 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519456 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-host-cni-bin\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz"
Apr 23 16:35:16.520037 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519474 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgjb8\" (UniqueName: \"kubernetes.io/projected/b70dcd33-861c-4a46-8752-421c750db1ff-kube-api-access-rgjb8\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg"
Apr 23 16:35:16.520037 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519521 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-etc-kubernetes\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt"
Apr 23 16:35:16.520037 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519547 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ee57872d-b83c-49ea-b226-5322cb6d1db3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wqfht\" (UID: \"ee57872d-b83c-49ea-b226-5322cb6d1db3\") " pod="openshift-multus/multus-additional-cni-plugins-wqfht"
Apr 23 16:35:16.520037 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519572 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-etc-openvswitch\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz"
Apr 23 16:35:16.521510 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519600 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5949893b-cd3d-46d5-b194-4ef1ad542b81-ovn-node-metrics-cert\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz"
Apr 23 16:35:16.521510 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519623 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-etc-kubernetes\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt"
Apr 23 16:35:16.521510 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519646 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-host-var-lib-cni-multus\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg"
Apr 23 16:35:16.521510 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519662 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-etc-openvswitch\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz"
Apr 23 16:35:16.521510 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519682 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-tmp\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt"
Apr 23 16:35:16.521510 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519709 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/398390dc-662b-42ee-b57e-22175922f0ac-iptables-alerter-script\") pod \"iptables-alerter-rflcs\" (UID: \"398390dc-662b-42ee-b57e-22175922f0ac\") " pod="openshift-network-operator/iptables-alerter-rflcs"
Apr 23 16:35:16.521510 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519732 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5da0035a-7e6e-4e50-9404-1dde996e4313-tmp-dir\") pod \"node-resolver-w5pdt\" (UID: \"5da0035a-7e6e-4e50-9404-1dde996e4313\") " pod="openshift-dns/node-resolver-w5pdt"
Apr 23 16:35:16.521510 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519782 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ee57872d-b83c-49ea-b226-5322cb6d1db3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wqfht\" (UID: \"ee57872d-b83c-49ea-b226-5322cb6d1db3\") " pod="openshift-multus/multus-additional-cni-plugins-wqfht"
Apr 23 16:35:16.521510 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519735 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-host-run-netns\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz"
Apr 23 16:35:16.521510 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519815 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-host-run-netns\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz"
Apr 23 16:35:16.521510 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519827 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dpx5b\" (UniqueName: \"kubernetes.io/projected/5949893b-cd3d-46d5-b194-4ef1ad542b81-kube-api-access-dpx5b\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz"
Apr 23 16:35:16.521510 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519854 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-multus-cni-dir\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg"
Apr 23 16:35:16.521510 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519987 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b153d070-3e87-4322-a47e-5cefb6aded60-kubelet-dir\") pod \"aws-ebs-csi-driver-node-g68pm\" (UID: \"b153d070-3e87-4322-a47e-5cefb6aded60\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g68pm"
Apr 23 16:35:16.521510 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519995 2580 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 23 16:35:16.521510 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.520211 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/398390dc-662b-42ee-b57e-22175922f0ac-iptables-alerter-script\") pod \"iptables-alerter-rflcs\" (UID: \"398390dc-662b-42ee-b57e-22175922f0ac\") " pod="openshift-network-operator/iptables-alerter-rflcs"
Apr 23 16:35:16.521510 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.519880 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b153d070-3e87-4322-a47e-5cefb6aded60-kubelet-dir\") pod \"aws-ebs-csi-driver-node-g68pm\" (UID: \"b153d070-3e87-4322-a47e-5cefb6aded60\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g68pm"
Apr 23 16:35:16.521510 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.520319 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-etc-sysctl-d\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt"
Apr 23 16:35:16.522307 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.520345 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pjms6\" (UniqueName: \"kubernetes.io/projected/3c0ebdcb-a7e8-4e29-a486-52aed308cf33-kube-api-access-pjms6\") pod \"network-check-target-kh9hh\" (UID: \"3c0ebdcb-a7e8-4e29-a486-52aed308cf33\") " pod="openshift-network-diagnostics/network-check-target-kh9hh"
Apr 23 16:35:16.522307 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.520371 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4tch\" (UniqueName: \"kubernetes.io/projected/5da0035a-7e6e-4e50-9404-1dde996e4313-kube-api-access-v4tch\") pod \"node-resolver-w5pdt\" (UID: \"5da0035a-7e6e-4e50-9404-1dde996e4313\") " pod="openshift-dns/node-resolver-w5pdt"
Apr 23 16:35:16.522307 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.520397 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-var-lib-openvswitch\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz"
Apr 23 16:35:16.522307 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.520420 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-hostroot\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg"
Apr 23 16:35:16.522307 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.520446 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-var-lib-kubelet\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt"
Apr 23 16:35:16.522307 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.520453 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-etc-sysctl-d\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt"
Apr 23 16:35:16.522307 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.520476 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/268f3349-6678-43d5-8596-698c807f908a-agent-certs\") pod \"konnectivity-agent-q8rd7\" (UID: \"268f3349-6678-43d5-8596-698c807f908a\") " pod="kube-system/konnectivity-agent-q8rd7"
Apr 23 16:35:16.522307 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.520512 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-host-run-netns\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg"
Apr 23 16:35:16.522307 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.520523 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-var-lib-openvswitch\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz"
Apr 23 16:35:16.522307 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.520534 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-multus-conf-dir\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg"
Apr 23 16:35:16.522307 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.520560 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-metrics-certs\") pod \"network-metrics-daemon-5ps7g\" (UID: \"4033b659-eaae-4ad3-a8a3-523bdf5fcf89\") " pod="openshift-multus/network-metrics-daemon-5ps7g"
Apr 23 16:35:16.522307 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.520587 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nv7gb\" (UniqueName: \"kubernetes.io/projected/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-kube-api-access-nv7gb\") pod \"network-metrics-daemon-5ps7g\" (UID: \"4033b659-eaae-4ad3-a8a3-523bdf5fcf89\") " pod="openshift-multus/network-metrics-daemon-5ps7g"
Apr 23 16:35:16.522307 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.520589 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ee57872d-b83c-49ea-b226-5322cb6d1db3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wqfht\" (UID: \"ee57872d-b83c-49ea-b226-5322cb6d1db3\") " pod="openshift-multus/multus-additional-cni-plugins-wqfht"
Apr 23 16:35:16.522307 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.520601 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-var-lib-kubelet\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt"
Apr 23 16:35:16.522307 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.520612 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b153d070-3e87-4322-a47e-5cefb6aded60-registration-dir\") pod \"aws-ebs-csi-driver-node-g68pm\" (UID: \"b153d070-3e87-4322-a47e-5cefb6aded60\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g68pm"
Apr 23 16:35:16.522307 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.520635 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ee57872d-b83c-49ea-b226-5322cb6d1db3-system-cni-dir\") pod \"multus-additional-cni-plugins-wqfht\" (UID: \"ee57872d-b83c-49ea-b226-5322cb6d1db3\") " pod="openshift-multus/multus-additional-cni-plugins-wqfht"
Apr 23 16:35:16.522307 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.520654 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5949893b-cd3d-46d5-b194-4ef1ad542b81-ovnkube-config\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz"
Apr 23 16:35:16.523080 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.520674 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7jhh9\" (UniqueName: \"kubernetes.io/projected/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-kube-api-access-7jhh9\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt"
Apr 23 16:35:16.523080 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:16.520680 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:16.523080 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.520710 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ee57872d-b83c-49ea-b226-5322cb6d1db3-system-cni-dir\") pod \"multus-additional-cni-plugins-wqfht\" (UID: \"ee57872d-b83c-49ea-b226-5322cb6d1db3\") " pod="openshift-multus/multus-additional-cni-plugins-wqfht"
Apr 23 16:35:16.523080 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.520716 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gv9fl\" (UniqueName: \"kubernetes.io/projected/ee57872d-b83c-49ea-b226-5322cb6d1db3-kube-api-access-gv9fl\") pod \"multus-additional-cni-plugins-wqfht\" (UID: \"ee57872d-b83c-49ea-b226-5322cb6d1db3\") " pod="openshift-multus/multus-additional-cni-plugins-wqfht"
Apr 23 16:35:16.523080 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:16.520745 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-metrics-certs podName:4033b659-eaae-4ad3-a8a3-523bdf5fcf89 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:17.020723999 +0000 UTC m=+3.081798238 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-metrics-certs") pod "network-metrics-daemon-5ps7g" (UID: "4033b659-eaae-4ad3-a8a3-523bdf5fcf89") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:16.523080 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.520747 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b153d070-3e87-4322-a47e-5cefb6aded60-registration-dir\") pod \"aws-ebs-csi-driver-node-g68pm\" (UID: \"b153d070-3e87-4322-a47e-5cefb6aded60\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g68pm"
Apr 23 16:35:16.523080 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.520777 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-os-release\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg"
Apr 23 16:35:16.523080 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.520808 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/047666b9-5e8b-4117-8317-ca917bf89757-original-pull-secret\") pod \"global-pull-secret-syncer-tr482\" (UID: \"047666b9-5e8b-4117-8317-ca917bf89757\") " pod="kube-system/global-pull-secret-syncer-tr482"
Apr 23 16:35:16.523080 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.520836 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6wpfp\" (UniqueName: \"kubernetes.io/projected/b153d070-3e87-4322-a47e-5cefb6aded60-kube-api-access-6wpfp\") pod \"aws-ebs-csi-driver-node-g68pm\" (UID: \"b153d070-3e87-4322-a47e-5cefb6aded60\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g68pm"
Apr 23 16:35:16.523080 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.520866 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz"
Apr 23 16:35:16.523080 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.520894 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-etc-sysconfig\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt"
Apr 23 16:35:16.523080 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.520919 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/398390dc-662b-42ee-b57e-22175922f0ac-host-slash\") pod \"iptables-alerter-rflcs\" (UID: \"398390dc-662b-42ee-b57e-22175922f0ac\") " pod="openshift-network-operator/iptables-alerter-rflcs"
Apr 23 16:35:16.523080 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.520944 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-25hbm\" (UniqueName: \"kubernetes.io/projected/398390dc-662b-42ee-b57e-22175922f0ac-kube-api-access-25hbm\") pod \"iptables-alerter-rflcs\" (UID: \"398390dc-662b-42ee-b57e-22175922f0ac\") " pod="openshift-network-operator/iptables-alerter-rflcs"
Apr 23 16:35:16.523080 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.520967 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5949893b-cd3d-46d5-b194-4ef1ad542b81-env-overrides\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz"
Apr 23 16:35:16.523080 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.520995 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-system-cni-dir\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg"
Apr 23 16:35:16.523080 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521020 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-host-var-lib-cni-bin\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg"
Apr 23 16:35:16.523845 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521020 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/398390dc-662b-42ee-b57e-22175922f0ac-host-slash\") pod \"iptables-alerter-rflcs\" (UID: \"398390dc-662b-42ee-b57e-22175922f0ac\") " pod="openshift-network-operator/iptables-alerter-rflcs"
Apr 23 16:35:16.523845 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521045 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-etc-kubernetes\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg"
Apr 23 16:35:16.523845 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521067 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5949893b-cd3d-46d5-b194-4ef1ad542b81-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz"
Apr 23 16:35:16.523845 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521075 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b153d070-3e87-4322-a47e-5cefb6aded60-etc-selinux\") pod \"aws-ebs-csi-driver-node-g68pm\" (UID: \"b153d070-3e87-4322-a47e-5cefb6aded60\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g68pm"
Apr 23 16:35:16.523845 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521100 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-etc-tuned\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt"
Apr 23 16:35:16.523845 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521111 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-etc-sysconfig\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt"
Apr 23 16:35:16.523845 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521127 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5949893b-cd3d-46d5-b194-4ef1ad542b81-ovnkube-script-lib\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz"
Apr 23 16:35:16.523845 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521153 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-cnibin\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg"
Apr 23 16:35:16.523845 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521177 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-host-var-lib-kubelet\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg"
Apr 23 16:35:16.523845 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521202 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/047666b9-5e8b-4117-8317-ca917bf89757-kubelet-config\") pod \"global-pull-secret-syncer-tr482\" (UID: \"047666b9-5e8b-4117-8317-ca917bf89757\") " pod="kube-system/global-pull-secret-syncer-tr482"
Apr 23 16:35:16.523845 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521227 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e6b0d15-ef44-4cfa-b68b-1d1110a5e3b4-host\") pod \"node-ca-z94bz\" (UID: \"8e6b0d15-ef44-4cfa-b68b-1d1110a5e3b4\") " pod="openshift-image-registry/node-ca-z94bz"
Apr 23 16:35:16.523845 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521253 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b70dcd33-861c-4a46-8752-421c750db1ff-cni-binary-copy\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg"
Apr 23 16:35:16.523845 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521365 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e6b0d15-ef44-4cfa-b68b-1d1110a5e3b4-host\") pod \"node-ca-z94bz\" (UID: \"8e6b0d15-ef44-4cfa-b68b-1d1110a5e3b4\") " pod="openshift-image-registry/node-ca-z94bz"
Apr 23 16:35:16.523845 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521374 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b153d070-3e87-4322-a47e-5cefb6aded60-etc-selinux\") pod \"aws-ebs-csi-driver-node-g68pm\" (UID: \"b153d070-3e87-4322-a47e-5cefb6aded60\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g68pm"
Apr 23 16:35:16.523845 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521425 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-etc-sysctl-conf\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt"
Apr 23 16:35:16.523845 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521451 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-sys\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt"
Apr 23 16:35:16.523845 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521475 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-lib-modules\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt"
Apr 23 16:35:16.524621 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521504 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b153d070-3e87-4322-a47e-5cefb6aded60-device-dir\") pod \"aws-ebs-csi-driver-node-g68pm\" (UID: \"b153d070-3e87-4322-a47e-5cefb6aded60\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g68pm"
Apr 23 16:35:16.524621 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521509 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5949893b-cd3d-46d5-b194-4ef1ad542b81-env-overrides\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz"
Apr 23 16:35:16.524621 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521528 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b153d070-3e87-4322-a47e-5cefb6aded60-sys-fs\") pod \"aws-ebs-csi-driver-node-g68pm\" (UID: \"b153d070-3e87-4322-a47e-5cefb6aded60\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g68pm"
Apr 23 16:35:16.524621 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521556 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kgmfm\" (UniqueName: \"kubernetes.io/projected/8e6b0d15-ef44-4cfa-b68b-1d1110a5e3b4-kube-api-access-kgmfm\") pod \"node-ca-z94bz\" (UID: \"8e6b0d15-ef44-4cfa-b68b-1d1110a5e3b4\") " pod="openshift-image-registry/node-ca-z94bz"
Apr 23 16:35:16.524621 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521566 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-sys\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt"
Apr 23 16:35:16.524621 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521582 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ee57872d-b83c-49ea-b226-5322cb6d1db3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wqfht\" (UID: \"ee57872d-b83c-49ea-b226-5322cb6d1db3\") " pod="openshift-multus/multus-additional-cni-plugins-wqfht"
Apr 23 16:35:16.524621 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521606 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-etc-systemd\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt"
Apr 23 16:35:16.524621 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521626 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ee57872d-b83c-49ea-b226-5322cb6d1db3-cnibin\") pod \"multus-additional-cni-plugins-wqfht\" (UID: \"ee57872d-b83c-49ea-b226-5322cb6d1db3\") " pod="openshift-multus/multus-additional-cni-plugins-wqfht"
Apr 23 16:35:16.524621 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521647 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ee57872d-b83c-49ea-b226-5322cb6d1db3-os-release\") pod \"multus-additional-cni-plugins-wqfht\" (UID: \"ee57872d-b83c-49ea-b226-5322cb6d1db3\") " pod="openshift-multus/multus-additional-cni-plugins-wqfht"
Apr 23 16:35:16.524621 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521671 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-etc-sysctl-conf\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt"
Apr 23 16:35:16.524621 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521671 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/268f3349-6678-43d5-8596-698c807f908a-konnectivity-ca\") pod \"konnectivity-agent-q8rd7\" (UID: \"268f3349-6678-43d5-8596-698c807f908a\") " pod="kube-system/konnectivity-agent-q8rd7"
Apr 23 16:35:16.524621 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521750 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5949893b-cd3d-46d5-b194-4ef1ad542b81-ovnkube-script-lib\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz"
Apr 23 16:35:16.524621 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521822 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b153d070-3e87-4322-a47e-5cefb6aded60-device-dir\") pod \"aws-ebs-csi-driver-node-g68pm\" (UID: \"b153d070-3e87-4322-a47e-5cefb6aded60\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g68pm"
Apr 23 16:35:16.524621 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521923 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5949893b-cd3d-46d5-b194-4ef1ad542b81-ovnkube-config\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz"
Apr 23 16:35:16.524621 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521954 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b153d070-3e87-4322-a47e-5cefb6aded60-sys-fs\") pod \"aws-ebs-csi-driver-node-g68pm\" (UID: \"b153d070-3e87-4322-a47e-5cefb6aded60\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g68pm"
Apr 23 16:35:16.524621 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521963 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ee57872d-b83c-49ea-b226-5322cb6d1db3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wqfht\" (UID: \"ee57872d-b83c-49ea-b226-5322cb6d1db3\") " pod="openshift-multus/multus-additional-cni-plugins-wqfht"
Apr 23 16:35:16.524621 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.521824 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-etc-systemd\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt"
Apr 23 16:35:16.525163 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.522010 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-lib-modules\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt"
Apr 23 16:35:16.525163 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.522016 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ee57872d-b83c-49ea-b226-5322cb6d1db3-cnibin\") pod \"multus-additional-cni-plugins-wqfht\" (UID: \"ee57872d-b83c-49ea-b226-5322cb6d1db3\") " pod="openshift-multus/multus-additional-cni-plugins-wqfht"
Apr 23 16:35:16.525163 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.522036 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ee57872d-b83c-49ea-b226-5322cb6d1db3-os-release\") pod \"multus-additional-cni-plugins-wqfht\" (UID: \"ee57872d-b83c-49ea-b226-5322cb6d1db3\") " pod="openshift-multus/multus-additional-cni-plugins-wqfht"
Apr 23 16:35:16.525163 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.522159 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/268f3349-6678-43d5-8596-698c807f908a-konnectivity-ca\") pod \"konnectivity-agent-q8rd7\" (UID: \"268f3349-6678-43d5-8596-698c807f908a\") " pod="kube-system/konnectivity-agent-q8rd7"
Apr 23 16:35:16.525163 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.523665 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5949893b-cd3d-46d5-b194-4ef1ad542b81-ovn-node-metrics-cert\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz"
Apr 23 16:35:16.525163 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.523675 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-tmp\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt"
Apr 23 16:35:16.525163 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.523914 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/268f3349-6678-43d5-8596-698c807f908a-agent-certs\") pod \"konnectivity-agent-q8rd7\" (UID: \"268f3349-6678-43d5-8596-698c807f908a\") " pod="kube-system/konnectivity-agent-q8rd7"
Apr 23 16:35:16.525163 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.524528 2580
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-etc-tuned\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt" Apr 23 16:35:16.525512 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.525486 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 16:30:15 +0000 UTC" deadline="2028-01-30 09:58:28.711960884 +0000 UTC" Apr 23 16:35:16.525561 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.525513 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15521h23m12.186451345s" Apr 23 16:35:16.529231 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:16.529211 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:35:16.529347 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:16.529236 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:35:16.529347 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:16.529249 2580 projected.go:194] Error preparing data for projected volume kube-api-access-pjms6 for pod openshift-network-diagnostics/network-check-target-kh9hh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:16.529347 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:16.529320 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c0ebdcb-a7e8-4e29-a486-52aed308cf33-kube-api-access-pjms6 podName:3c0ebdcb-a7e8-4e29-a486-52aed308cf33 
nodeName:}" failed. No retries permitted until 2026-04-23 16:35:17.029303997 +0000 UTC m=+3.090378230 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-pjms6" (UniqueName: "kubernetes.io/projected/3c0ebdcb-a7e8-4e29-a486-52aed308cf33-kube-api-access-pjms6") pod "network-check-target-kh9hh" (UID: "3c0ebdcb-a7e8-4e29-a486-52aed308cf33") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:16.531457 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.531439 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4tch\" (UniqueName: \"kubernetes.io/projected/5da0035a-7e6e-4e50-9404-1dde996e4313-kube-api-access-v4tch\") pod \"node-resolver-w5pdt\" (UID: \"5da0035a-7e6e-4e50-9404-1dde996e4313\") " pod="openshift-dns/node-resolver-w5pdt" Apr 23 16:35:16.531560 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.531521 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv7gb\" (UniqueName: \"kubernetes.io/projected/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-kube-api-access-nv7gb\") pod \"network-metrics-daemon-5ps7g\" (UID: \"4033b659-eaae-4ad3-a8a3-523bdf5fcf89\") " pod="openshift-multus/network-metrics-daemon-5ps7g" Apr 23 16:35:16.539549 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.539526 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgmfm\" (UniqueName: \"kubernetes.io/projected/8e6b0d15-ef44-4cfa-b68b-1d1110a5e3b4-kube-api-access-kgmfm\") pod \"node-ca-z94bz\" (UID: \"8e6b0d15-ef44-4cfa-b68b-1d1110a5e3b4\") " pod="openshift-image-registry/node-ca-z94bz" Apr 23 16:35:16.539647 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.539523 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv9fl\" (UniqueName: 
\"kubernetes.io/projected/ee57872d-b83c-49ea-b226-5322cb6d1db3-kube-api-access-gv9fl\") pod \"multus-additional-cni-plugins-wqfht\" (UID: \"ee57872d-b83c-49ea-b226-5322cb6d1db3\") " pod="openshift-multus/multus-additional-cni-plugins-wqfht" Apr 23 16:35:16.540776 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.540749 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wpfp\" (UniqueName: \"kubernetes.io/projected/b153d070-3e87-4322-a47e-5cefb6aded60-kube-api-access-6wpfp\") pod \"aws-ebs-csi-driver-node-g68pm\" (UID: \"b153d070-3e87-4322-a47e-5cefb6aded60\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g68pm" Apr 23 16:35:16.541107 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.541087 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpx5b\" (UniqueName: \"kubernetes.io/projected/5949893b-cd3d-46d5-b194-4ef1ad542b81-kube-api-access-dpx5b\") pod \"ovnkube-node-nfkqz\" (UID: \"5949893b-cd3d-46d5-b194-4ef1ad542b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:16.541229 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.541208 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-25hbm\" (UniqueName: \"kubernetes.io/projected/398390dc-662b-42ee-b57e-22175922f0ac-kube-api-access-25hbm\") pod \"iptables-alerter-rflcs\" (UID: \"398390dc-662b-42ee-b57e-22175922f0ac\") " pod="openshift-network-operator/iptables-alerter-rflcs" Apr 23 16:35:16.541763 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.541743 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jhh9\" (UniqueName: \"kubernetes.io/projected/17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6-kube-api-access-7jhh9\") pod \"tuned-9qvrt\" (UID: \"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6\") " pod="openshift-cluster-node-tuning-operator/tuned-9qvrt" Apr 23 16:35:16.622076 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622041 
2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-cnibin\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.622076 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622079 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-host-var-lib-kubelet\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.622329 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622099 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/047666b9-5e8b-4117-8317-ca917bf89757-kubelet-config\") pod \"global-pull-secret-syncer-tr482\" (UID: \"047666b9-5e8b-4117-8317-ca917bf89757\") " pod="kube-system/global-pull-secret-syncer-tr482" Apr 23 16:35:16.622329 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622123 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b70dcd33-861c-4a46-8752-421c750db1ff-cni-binary-copy\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.622329 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622165 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-multus-socket-dir-parent\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.622329 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622174 2580 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-cnibin\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.622329 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622177 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-host-var-lib-kubelet\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.622329 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622188 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b70dcd33-861c-4a46-8752-421c750db1ff-multus-daemon-config\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.622329 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622232 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-host-run-multus-certs\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.622329 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622242 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-multus-socket-dir-parent\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.622329 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622198 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" 
(UniqueName: \"kubernetes.io/host-path/047666b9-5e8b-4117-8317-ca917bf89757-kubelet-config\") pod \"global-pull-secret-syncer-tr482\" (UID: \"047666b9-5e8b-4117-8317-ca917bf89757\") " pod="kube-system/global-pull-secret-syncer-tr482" Apr 23 16:35:16.622329 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622306 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-host-run-multus-certs\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.622329 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622307 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/047666b9-5e8b-4117-8317-ca917bf89757-dbus\") pod \"global-pull-secret-syncer-tr482\" (UID: \"047666b9-5e8b-4117-8317-ca917bf89757\") " pod="kube-system/global-pull-secret-syncer-tr482" Apr 23 16:35:16.622901 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622340 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-host-run-k8s-cni-cncf-io\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.622901 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622361 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rgjb8\" (UniqueName: \"kubernetes.io/projected/b70dcd33-861c-4a46-8752-421c750db1ff-kube-api-access-rgjb8\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.622901 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622384 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-host-var-lib-cni-multus\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.622901 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622410 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-host-run-k8s-cni-cncf-io\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.622901 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622429 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-multus-cni-dir\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.622901 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622457 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-host-var-lib-cni-multus\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.622901 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622421 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/047666b9-5e8b-4117-8317-ca917bf89757-dbus\") pod \"global-pull-secret-syncer-tr482\" (UID: \"047666b9-5e8b-4117-8317-ca917bf89757\") " pod="kube-system/global-pull-secret-syncer-tr482" Apr 23 16:35:16.622901 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622473 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-multus-cni-dir\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.622901 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622486 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-hostroot\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.622901 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622515 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-host-run-netns\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.622901 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622541 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-multus-conf-dir\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.622901 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622580 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-os-release\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.622901 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622602 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-hostroot\") pod \"multus-xs2mg\" (UID: 
\"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.622901 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622604 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/047666b9-5e8b-4117-8317-ca917bf89757-original-pull-secret\") pod \"global-pull-secret-syncer-tr482\" (UID: \"047666b9-5e8b-4117-8317-ca917bf89757\") " pod="kube-system/global-pull-secret-syncer-tr482" Apr 23 16:35:16.622901 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622642 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-system-cni-dir\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.622901 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622666 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-host-var-lib-cni-bin\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.622901 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622671 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-host-run-netns\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.622901 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622674 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-multus-conf-dir\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " 
pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.623847 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622691 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-etc-kubernetes\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.623847 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622722 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-os-release\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.623847 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622726 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-host-var-lib-cni-bin\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.623847 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622750 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b70dcd33-861c-4a46-8752-421c750db1ff-multus-daemon-config\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.623847 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622767 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-system-cni-dir\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.623847 ip-10-0-128-198 kubenswrapper[2580]: 
I0423 16:35:16.622774 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b70dcd33-861c-4a46-8752-421c750db1ff-cni-binary-copy\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.623847 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.622770 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b70dcd33-861c-4a46-8752-421c750db1ff-etc-kubernetes\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.623847 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:16.622828 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:16.623847 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:16.622881 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/047666b9-5e8b-4117-8317-ca917bf89757-original-pull-secret podName:047666b9-5e8b-4117-8317-ca917bf89757 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:17.122864465 +0000 UTC m=+3.183938699 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/047666b9-5e8b-4117-8317-ca917bf89757-original-pull-secret") pod "global-pull-secret-syncer-tr482" (UID: "047666b9-5e8b-4117-8317-ca917bf89757") : object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:16.632483 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.632462 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgjb8\" (UniqueName: \"kubernetes.io/projected/b70dcd33-861c-4a46-8752-421c750db1ff-kube-api-access-rgjb8\") pod \"multus-xs2mg\" (UID: \"b70dcd33-861c-4a46-8752-421c750db1ff\") " pod="openshift-multus/multus-xs2mg" Apr 23 16:35:16.632938 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.632921 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 16:35:16.714402 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.714312 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-q8rd7" Apr 23 16:35:16.721960 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.721936 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w5pdt" Apr 23 16:35:16.729676 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.729657 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:16.734313 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.734280 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9qvrt" Apr 23 16:35:16.740831 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.740815 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-rflcs" Apr 23 16:35:16.747331 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.747314 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-z94bz" Apr 23 16:35:16.753931 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.753910 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g68pm" Apr 23 16:35:16.760523 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.760499 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wqfht" Apr 23 16:35:16.765094 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:16.765076 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xs2mg" Apr 23 16:35:17.025125 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:17.025094 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-metrics-certs\") pod \"network-metrics-daemon-5ps7g\" (UID: \"4033b659-eaae-4ad3-a8a3-523bdf5fcf89\") " pod="openshift-multus/network-metrics-daemon-5ps7g" Apr 23 16:35:17.025325 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:17.025207 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:17.025325 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:17.025259 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-metrics-certs podName:4033b659-eaae-4ad3-a8a3-523bdf5fcf89 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:18.025242301 +0000 UTC m=+4.086316534 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-metrics-certs") pod "network-metrics-daemon-5ps7g" (UID: "4033b659-eaae-4ad3-a8a3-523bdf5fcf89") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:17.091030 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:17.090835 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb70dcd33_861c_4a46_8752_421c750db1ff.slice/crio-62860ab4a68c1f264437dc809b2e628faa1d1e4452ed6353e4564ffc1ea43861 WatchSource:0}: Error finding container 62860ab4a68c1f264437dc809b2e628faa1d1e4452ed6353e4564ffc1ea43861: Status 404 returned error can't find the container with id 62860ab4a68c1f264437dc809b2e628faa1d1e4452ed6353e4564ffc1ea43861
Apr 23 16:35:17.092673 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:17.092644 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5da0035a_7e6e_4e50_9404_1dde996e4313.slice/crio-07f4c7721d1481b1a56c46aeddbc20c70d34adf7f47e78ffa6856a5226850477 WatchSource:0}: Error finding container 07f4c7721d1481b1a56c46aeddbc20c70d34adf7f47e78ffa6856a5226850477: Status 404 returned error can't find the container with id 07f4c7721d1481b1a56c46aeddbc20c70d34adf7f47e78ffa6856a5226850477
Apr 23 16:35:17.094212 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:17.094185 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5949893b_cd3d_46d5_b194_4ef1ad542b81.slice/crio-62e8ef140d507372f5de65dc183be0d4581dc2cdab7542d37f9c2c5e4c03060e WatchSource:0}: Error finding container 62e8ef140d507372f5de65dc183be0d4581dc2cdab7542d37f9c2c5e4c03060e: Status 404 returned error can't find the container with id 62e8ef140d507372f5de65dc183be0d4581dc2cdab7542d37f9c2c5e4c03060e
Apr 23 16:35:17.095076 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:17.095046 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb153d070_3e87_4322_a47e_5cefb6aded60.slice/crio-5dfe764d0d7ed8933f7fae8efcc617de41826522a279071f2bfb9b1bb7b8129f WatchSource:0}: Error finding container 5dfe764d0d7ed8933f7fae8efcc617de41826522a279071f2bfb9b1bb7b8129f: Status 404 returned error can't find the container with id 5dfe764d0d7ed8933f7fae8efcc617de41826522a279071f2bfb9b1bb7b8129f
Apr 23 16:35:17.098218 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:17.098194 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17d9d4eb_d09b_4544_b4c2_6f2bd29c53a6.slice/crio-9642e4c2e021f1840fb7e16284dddf60bd1348e791b5b23e8d0dff373597c757 WatchSource:0}: Error finding container 9642e4c2e021f1840fb7e16284dddf60bd1348e791b5b23e8d0dff373597c757: Status 404 returned error can't find the container with id 9642e4c2e021f1840fb7e16284dddf60bd1348e791b5b23e8d0dff373597c757
Apr 23 16:35:17.098883 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:17.098865 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod398390dc_662b_42ee_b57e_22175922f0ac.slice/crio-25b727856387e4409100eb700641337929b553694073b58f0a5dc43387afef20 WatchSource:0}: Error finding container 25b727856387e4409100eb700641337929b553694073b58f0a5dc43387afef20: Status 404 returned error can't find the container with id 25b727856387e4409100eb700641337929b553694073b58f0a5dc43387afef20
Apr 23 16:35:17.099922 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:17.099858 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee57872d_b83c_49ea_b226_5322cb6d1db3.slice/crio-0d5aa24b94381b4fb8a0e19ede7d66e59d9f18332043a59e9b74cbbbff4116f8 WatchSource:0}: Error finding container 0d5aa24b94381b4fb8a0e19ede7d66e59d9f18332043a59e9b74cbbbff4116f8: Status 404 returned error can't find the container with id 0d5aa24b94381b4fb8a0e19ede7d66e59d9f18332043a59e9b74cbbbff4116f8
Apr 23 16:35:17.100743 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:17.100671 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e6b0d15_ef44_4cfa_b68b_1d1110a5e3b4.slice/crio-b032663dbcdc0affc662b4a69f219f76dc0b314d3f3fa9ec8822e5b54388665d WatchSource:0}: Error finding container b032663dbcdc0affc662b4a69f219f76dc0b314d3f3fa9ec8822e5b54388665d: Status 404 returned error can't find the container with id b032663dbcdc0affc662b4a69f219f76dc0b314d3f3fa9ec8822e5b54388665d
Apr 23 16:35:17.126342 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:17.126317 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/047666b9-5e8b-4117-8317-ca917bf89757-original-pull-secret\") pod \"global-pull-secret-syncer-tr482\" (UID: \"047666b9-5e8b-4117-8317-ca917bf89757\") " pod="kube-system/global-pull-secret-syncer-tr482"
Apr 23 16:35:17.126435 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:17.126382 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 16:35:17.126435 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:17.126397 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pjms6\" (UniqueName: \"kubernetes.io/projected/3c0ebdcb-a7e8-4e29-a486-52aed308cf33-kube-api-access-pjms6\") pod \"network-check-target-kh9hh\" (UID: \"3c0ebdcb-a7e8-4e29-a486-52aed308cf33\") " pod="openshift-network-diagnostics/network-check-target-kh9hh"
Apr 23 16:35:17.126435 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:17.126434 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/047666b9-5e8b-4117-8317-ca917bf89757-original-pull-secret podName:047666b9-5e8b-4117-8317-ca917bf89757 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:18.126417787 +0000 UTC m=+4.187492023 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/047666b9-5e8b-4117-8317-ca917bf89757-original-pull-secret") pod "global-pull-secret-syncer-tr482" (UID: "047666b9-5e8b-4117-8317-ca917bf89757") : object "kube-system"/"original-pull-secret" not registered
Apr 23 16:35:17.126556 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:17.126477 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:35:17.126556 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:17.126488 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:35:17.126556 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:17.126496 2580 projected.go:194] Error preparing data for projected volume kube-api-access-pjms6 for pod openshift-network-diagnostics/network-check-target-kh9hh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:17.126556 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:17.126530 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c0ebdcb-a7e8-4e29-a486-52aed308cf33-kube-api-access-pjms6 podName:3c0ebdcb-a7e8-4e29-a486-52aed308cf33 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:18.126522193 +0000 UTC m=+4.187596429 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-pjms6" (UniqueName: "kubernetes.io/projected/3c0ebdcb-a7e8-4e29-a486-52aed308cf33-kube-api-access-pjms6") pod "network-check-target-kh9hh" (UID: "3c0ebdcb-a7e8-4e29-a486-52aed308cf33") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:17.526304 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:17.526219 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 16:30:15 +0000 UTC" deadline="2027-09-22 15:38:32.613545445 +0000 UTC"
Apr 23 16:35:17.526304 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:17.526257 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12407h3m15.08729171s"
Apr 23 16:35:17.532788 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:17.532310 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5ps7g"
Apr 23 16:35:17.532788 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:17.532443 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5ps7g" podUID="4033b659-eaae-4ad3-a8a3-523bdf5fcf89"
Apr 23 16:35:17.562099 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:17.561983 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-198.ec2.internal" event={"ID":"49434315036770a524de2f8664b84004","Type":"ContainerStarted","Data":"570894e80a5ef6f87694710055789dff0d0a37504f7765c1c5f9e27c1be80c57"}
Apr 23 16:35:17.576533 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:17.576463 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-q8rd7" event={"ID":"268f3349-6678-43d5-8596-698c807f908a","Type":"ContainerStarted","Data":"1874745c549e88e35ddc340496662917a6508c3b05886e104d707fc2bbb342b0"}
Apr 23 16:35:17.581914 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:17.581857 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-z94bz" event={"ID":"8e6b0d15-ef44-4cfa-b68b-1d1110a5e3b4","Type":"ContainerStarted","Data":"b032663dbcdc0affc662b4a69f219f76dc0b314d3f3fa9ec8822e5b54388665d"}
Apr 23 16:35:17.590829 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:17.590786 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rflcs" event={"ID":"398390dc-662b-42ee-b57e-22175922f0ac","Type":"ContainerStarted","Data":"25b727856387e4409100eb700641337929b553694073b58f0a5dc43387afef20"}
Apr 23 16:35:17.597130 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:17.597101 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" event={"ID":"5949893b-cd3d-46d5-b194-4ef1ad542b81","Type":"ContainerStarted","Data":"62e8ef140d507372f5de65dc183be0d4581dc2cdab7542d37f9c2c5e4c03060e"}
Apr 23 16:35:17.612184 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:17.612151 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w5pdt" event={"ID":"5da0035a-7e6e-4e50-9404-1dde996e4313","Type":"ContainerStarted","Data":"07f4c7721d1481b1a56c46aeddbc20c70d34adf7f47e78ffa6856a5226850477"}
Apr 23 16:35:17.615527 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:17.615481 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xs2mg" event={"ID":"b70dcd33-861c-4a46-8752-421c750db1ff","Type":"ContainerStarted","Data":"62860ab4a68c1f264437dc809b2e628faa1d1e4452ed6353e4564ffc1ea43861"}
Apr 23 16:35:17.618828 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:17.618781 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wqfht" event={"ID":"ee57872d-b83c-49ea-b226-5322cb6d1db3","Type":"ContainerStarted","Data":"0d5aa24b94381b4fb8a0e19ede7d66e59d9f18332043a59e9b74cbbbff4116f8"}
Apr 23 16:35:17.634094 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:17.634055 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9qvrt" event={"ID":"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6","Type":"ContainerStarted","Data":"9642e4c2e021f1840fb7e16284dddf60bd1348e791b5b23e8d0dff373597c757"}
Apr 23 16:35:17.638090 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:17.638062 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g68pm" event={"ID":"b153d070-3e87-4322-a47e-5cefb6aded60","Type":"ContainerStarted","Data":"5dfe764d0d7ed8933f7fae8efcc617de41826522a279071f2bfb9b1bb7b8129f"}
Apr 23 16:35:18.033557 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:18.033515 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-metrics-certs\") pod \"network-metrics-daemon-5ps7g\" (UID: \"4033b659-eaae-4ad3-a8a3-523bdf5fcf89\") " pod="openshift-multus/network-metrics-daemon-5ps7g"
Apr 23 16:35:18.033748 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:18.033664 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:18.033748 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:18.033737 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-metrics-certs podName:4033b659-eaae-4ad3-a8a3-523bdf5fcf89 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:20.033717578 +0000 UTC m=+6.094791811 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-metrics-certs") pod "network-metrics-daemon-5ps7g" (UID: "4033b659-eaae-4ad3-a8a3-523bdf5fcf89") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:18.134914 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:18.134877 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pjms6\" (UniqueName: \"kubernetes.io/projected/3c0ebdcb-a7e8-4e29-a486-52aed308cf33-kube-api-access-pjms6\") pod \"network-check-target-kh9hh\" (UID: \"3c0ebdcb-a7e8-4e29-a486-52aed308cf33\") " pod="openshift-network-diagnostics/network-check-target-kh9hh"
Apr 23 16:35:18.135092 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:18.134950 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/047666b9-5e8b-4117-8317-ca917bf89757-original-pull-secret\") pod \"global-pull-secret-syncer-tr482\" (UID: \"047666b9-5e8b-4117-8317-ca917bf89757\") " pod="kube-system/global-pull-secret-syncer-tr482"
Apr 23 16:35:18.135092 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:18.135088 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 16:35:18.135202 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:18.135109 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:35:18.135202 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:18.135135 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:35:18.135202 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:18.135148 2580 projected.go:194] Error preparing data for projected volume kube-api-access-pjms6 for pod openshift-network-diagnostics/network-check-target-kh9hh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:18.135202 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:18.135150 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/047666b9-5e8b-4117-8317-ca917bf89757-original-pull-secret podName:047666b9-5e8b-4117-8317-ca917bf89757 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:20.135133082 +0000 UTC m=+6.196207318 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/047666b9-5e8b-4117-8317-ca917bf89757-original-pull-secret") pod "global-pull-secret-syncer-tr482" (UID: "047666b9-5e8b-4117-8317-ca917bf89757") : object "kube-system"/"original-pull-secret" not registered
Apr 23 16:35:18.135433 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:18.135200 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c0ebdcb-a7e8-4e29-a486-52aed308cf33-kube-api-access-pjms6 podName:3c0ebdcb-a7e8-4e29-a486-52aed308cf33 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:20.135184076 +0000 UTC m=+6.196258312 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-pjms6" (UniqueName: "kubernetes.io/projected/3c0ebdcb-a7e8-4e29-a486-52aed308cf33-kube-api-access-pjms6") pod "network-check-target-kh9hh" (UID: "3c0ebdcb-a7e8-4e29-a486-52aed308cf33") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:18.533131 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:18.532622 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh9hh"
Apr 23 16:35:18.533131 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:18.532756 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kh9hh" podUID="3c0ebdcb-a7e8-4e29-a486-52aed308cf33"
Apr 23 16:35:18.533994 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:18.533858 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tr482"
Apr 23 16:35:18.533994 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:18.533956 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tr482" podUID="047666b9-5e8b-4117-8317-ca917bf89757"
Apr 23 16:35:18.647869 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:18.647781 2580 generic.go:358] "Generic (PLEG): container finished" podID="fd0ef97a99ad7f8bb0953574fa7327d0" containerID="0700dac565795bb4a70a9513b700fd56ad6eeb72ffce9f7198939c50d835598d" exitCode=0
Apr 23 16:35:18.649003 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:18.648708 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-198.ec2.internal" event={"ID":"fd0ef97a99ad7f8bb0953574fa7327d0","Type":"ContainerDied","Data":"0700dac565795bb4a70a9513b700fd56ad6eeb72ffce9f7198939c50d835598d"}
Apr 23 16:35:18.665206 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:18.665157 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-198.ec2.internal" podStartSLOduration=3.6651377 podStartE2EDuration="3.6651377s" podCreationTimestamp="2026-04-23 16:35:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:35:17.608693557 +0000 UTC m=+3.669767814" watchObservedRunningTime="2026-04-23 16:35:18.6651377 +0000 UTC m=+4.726211956"
Apr 23 16:35:19.532730 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:19.532693 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5ps7g"
Apr 23 16:35:19.532907 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:19.532867 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5ps7g" podUID="4033b659-eaae-4ad3-a8a3-523bdf5fcf89"
Apr 23 16:35:19.653073 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:19.653032 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-198.ec2.internal" event={"ID":"fd0ef97a99ad7f8bb0953574fa7327d0","Type":"ContainerStarted","Data":"07c96e451eacae3d27169cb85bcc4a4803def5f99f3266b47f30ec4609ee689e"}
Apr 23 16:35:19.669194 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:19.669140 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-198.ec2.internal" podStartSLOduration=4.669120631 podStartE2EDuration="4.669120631s" podCreationTimestamp="2026-04-23 16:35:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:35:19.668200585 +0000 UTC m=+5.729274852" watchObservedRunningTime="2026-04-23 16:35:19.669120631 +0000 UTC m=+5.730194887"
Apr 23 16:35:20.051916 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:20.051861 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-metrics-certs\") pod \"network-metrics-daemon-5ps7g\" (UID: \"4033b659-eaae-4ad3-a8a3-523bdf5fcf89\") " pod="openshift-multus/network-metrics-daemon-5ps7g"
Apr 23 16:35:20.052086 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:20.052021 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:20.052086 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:20.052084 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-metrics-certs podName:4033b659-eaae-4ad3-a8a3-523bdf5fcf89 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:24.052066399 +0000 UTC m=+10.113140634 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-metrics-certs") pod "network-metrics-daemon-5ps7g" (UID: "4033b659-eaae-4ad3-a8a3-523bdf5fcf89") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:20.153830 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:20.152978 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pjms6\" (UniqueName: \"kubernetes.io/projected/3c0ebdcb-a7e8-4e29-a486-52aed308cf33-kube-api-access-pjms6\") pod \"network-check-target-kh9hh\" (UID: \"3c0ebdcb-a7e8-4e29-a486-52aed308cf33\") " pod="openshift-network-diagnostics/network-check-target-kh9hh"
Apr 23 16:35:20.153830 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:20.153050 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/047666b9-5e8b-4117-8317-ca917bf89757-original-pull-secret\") pod \"global-pull-secret-syncer-tr482\" (UID: \"047666b9-5e8b-4117-8317-ca917bf89757\") " pod="kube-system/global-pull-secret-syncer-tr482"
Apr 23 16:35:20.153830 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:20.153172 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 16:35:20.153830 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:20.153234 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/047666b9-5e8b-4117-8317-ca917bf89757-original-pull-secret podName:047666b9-5e8b-4117-8317-ca917bf89757 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:24.153217405 +0000 UTC m=+10.214291638 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/047666b9-5e8b-4117-8317-ca917bf89757-original-pull-secret") pod "global-pull-secret-syncer-tr482" (UID: "047666b9-5e8b-4117-8317-ca917bf89757") : object "kube-system"/"original-pull-secret" not registered
Apr 23 16:35:20.153830 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:20.153709 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:35:20.153830 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:20.153730 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:35:20.153830 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:20.153742 2580 projected.go:194] Error preparing data for projected volume kube-api-access-pjms6 for pod openshift-network-diagnostics/network-check-target-kh9hh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:20.153830 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:20.153789 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c0ebdcb-a7e8-4e29-a486-52aed308cf33-kube-api-access-pjms6 podName:3c0ebdcb-a7e8-4e29-a486-52aed308cf33 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:24.153774237 +0000 UTC m=+10.214848471 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-pjms6" (UniqueName: "kubernetes.io/projected/3c0ebdcb-a7e8-4e29-a486-52aed308cf33-kube-api-access-pjms6") pod "network-check-target-kh9hh" (UID: "3c0ebdcb-a7e8-4e29-a486-52aed308cf33") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:20.533928 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:20.533414 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tr482"
Apr 23 16:35:20.533928 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:20.533464 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh9hh"
Apr 23 16:35:20.533928 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:20.533558 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tr482" podUID="047666b9-5e8b-4117-8317-ca917bf89757"
Apr 23 16:35:20.533928 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:20.533666 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kh9hh" podUID="3c0ebdcb-a7e8-4e29-a486-52aed308cf33"
Apr 23 16:35:21.532950 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:21.532720 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5ps7g"
Apr 23 16:35:21.532950 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:21.532867 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5ps7g" podUID="4033b659-eaae-4ad3-a8a3-523bdf5fcf89"
Apr 23 16:35:22.533249 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:22.533212 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh9hh"
Apr 23 16:35:22.533708 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:22.533350 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kh9hh" podUID="3c0ebdcb-a7e8-4e29-a486-52aed308cf33"
Apr 23 16:35:22.533769 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:22.533723 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tr482"
Apr 23 16:35:22.533816 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:22.533786 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tr482" podUID="047666b9-5e8b-4117-8317-ca917bf89757"
Apr 23 16:35:23.533267 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:23.533234 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5ps7g"
Apr 23 16:35:23.533741 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:23.533405 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5ps7g" podUID="4033b659-eaae-4ad3-a8a3-523bdf5fcf89"
Apr 23 16:35:24.087356 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:24.086715 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-metrics-certs\") pod \"network-metrics-daemon-5ps7g\" (UID: \"4033b659-eaae-4ad3-a8a3-523bdf5fcf89\") " pod="openshift-multus/network-metrics-daemon-5ps7g"
Apr 23 16:35:24.087356 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:24.086906 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:24.087356 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:24.086972 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-metrics-certs podName:4033b659-eaae-4ad3-a8a3-523bdf5fcf89 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:32.086952852 +0000 UTC m=+18.148027087 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-metrics-certs") pod "network-metrics-daemon-5ps7g" (UID: "4033b659-eaae-4ad3-a8a3-523bdf5fcf89") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:24.188149 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:24.188106 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/047666b9-5e8b-4117-8317-ca917bf89757-original-pull-secret\") pod \"global-pull-secret-syncer-tr482\" (UID: \"047666b9-5e8b-4117-8317-ca917bf89757\") " pod="kube-system/global-pull-secret-syncer-tr482"
Apr 23 16:35:24.188363 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:24.188205 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pjms6\" (UniqueName: \"kubernetes.io/projected/3c0ebdcb-a7e8-4e29-a486-52aed308cf33-kube-api-access-pjms6\") pod \"network-check-target-kh9hh\" (UID: \"3c0ebdcb-a7e8-4e29-a486-52aed308cf33\") " pod="openshift-network-diagnostics/network-check-target-kh9hh"
Apr 23 16:35:24.188363 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:24.188354 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:35:24.188483 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:24.188373 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:35:24.188483 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:24.188387 2580 projected.go:194] Error preparing data for projected volume kube-api-access-pjms6 for pod openshift-network-diagnostics/network-check-target-kh9hh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:24.188483 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:24.188446 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c0ebdcb-a7e8-4e29-a486-52aed308cf33-kube-api-access-pjms6 podName:3c0ebdcb-a7e8-4e29-a486-52aed308cf33 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:32.188427362 +0000 UTC m=+18.249501621 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-pjms6" (UniqueName: "kubernetes.io/projected/3c0ebdcb-a7e8-4e29-a486-52aed308cf33-kube-api-access-pjms6") pod "network-check-target-kh9hh" (UID: "3c0ebdcb-a7e8-4e29-a486-52aed308cf33") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:24.188708 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:24.188669 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 16:35:24.188790 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:24.188724 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/047666b9-5e8b-4117-8317-ca917bf89757-original-pull-secret podName:047666b9-5e8b-4117-8317-ca917bf89757 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:32.188710911 +0000 UTC m=+18.249785147 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/047666b9-5e8b-4117-8317-ca917bf89757-original-pull-secret") pod "global-pull-secret-syncer-tr482" (UID: "047666b9-5e8b-4117-8317-ca917bf89757") : object "kube-system"/"original-pull-secret" not registered
Apr 23 16:35:24.532897 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:24.532858 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh9hh"
Apr 23 16:35:24.533066 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:24.532999 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kh9hh" podUID="3c0ebdcb-a7e8-4e29-a486-52aed308cf33"
Apr 23 16:35:24.533258 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:24.533218 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tr482"
Apr 23 16:35:24.534375 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:24.534324 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tr482" podUID="047666b9-5e8b-4117-8317-ca917bf89757"
Apr 23 16:35:25.532351 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:25.532309 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5ps7g"
Apr 23 16:35:25.532524 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:25.532461 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5ps7g" podUID="4033b659-eaae-4ad3-a8a3-523bdf5fcf89"
Apr 23 16:35:26.536108 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:26.536033 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh9hh"
Apr 23 16:35:26.536108 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:26.536047 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tr482"
Apr 23 16:35:26.536610 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:26.536139 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kh9hh" podUID="3c0ebdcb-a7e8-4e29-a486-52aed308cf33"
Apr 23 16:35:26.536610 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:26.536208 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tr482" podUID="047666b9-5e8b-4117-8317-ca917bf89757"
Apr 23 16:35:27.533306 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:27.533269 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5ps7g"
Apr 23 16:35:27.533481 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:27.533400 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5ps7g" podUID="4033b659-eaae-4ad3-a8a3-523bdf5fcf89"
Apr 23 16:35:28.532956 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:28.532918 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh9hh"
Apr 23 16:35:28.533457 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:28.533052 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kh9hh" podUID="3c0ebdcb-a7e8-4e29-a486-52aed308cf33"
Apr 23 16:35:28.533457 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:28.533114 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tr482"
Apr 23 16:35:28.533457 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:28.533223 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-tr482" podUID="047666b9-5e8b-4117-8317-ca917bf89757" Apr 23 16:35:29.532923 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:29.532890 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5ps7g" Apr 23 16:35:29.533157 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:29.533031 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5ps7g" podUID="4033b659-eaae-4ad3-a8a3-523bdf5fcf89" Apr 23 16:35:30.532246 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:30.532212 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh9hh" Apr 23 16:35:30.532423 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:30.532349 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kh9hh" podUID="3c0ebdcb-a7e8-4e29-a486-52aed308cf33" Apr 23 16:35:30.532423 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:30.532403 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-tr482" Apr 23 16:35:30.532544 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:30.532490 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tr482" podUID="047666b9-5e8b-4117-8317-ca917bf89757" Apr 23 16:35:31.532792 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:31.532756 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5ps7g" Apr 23 16:35:31.533206 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:31.532904 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5ps7g" podUID="4033b659-eaae-4ad3-a8a3-523bdf5fcf89" Apr 23 16:35:32.146249 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:32.146205 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-metrics-certs\") pod \"network-metrics-daemon-5ps7g\" (UID: \"4033b659-eaae-4ad3-a8a3-523bdf5fcf89\") " pod="openshift-multus/network-metrics-daemon-5ps7g" Apr 23 16:35:32.146421 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:32.146370 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:32.146499 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:32.146448 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-metrics-certs podName:4033b659-eaae-4ad3-a8a3-523bdf5fcf89 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:48.146426623 +0000 UTC m=+34.207500910 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-metrics-certs") pod "network-metrics-daemon-5ps7g" (UID: "4033b659-eaae-4ad3-a8a3-523bdf5fcf89") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:32.247086 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:32.247045 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pjms6\" (UniqueName: \"kubernetes.io/projected/3c0ebdcb-a7e8-4e29-a486-52aed308cf33-kube-api-access-pjms6\") pod \"network-check-target-kh9hh\" (UID: \"3c0ebdcb-a7e8-4e29-a486-52aed308cf33\") " pod="openshift-network-diagnostics/network-check-target-kh9hh" Apr 23 16:35:32.247244 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:32.247101 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/047666b9-5e8b-4117-8317-ca917bf89757-original-pull-secret\") pod \"global-pull-secret-syncer-tr482\" (UID: \"047666b9-5e8b-4117-8317-ca917bf89757\") " pod="kube-system/global-pull-secret-syncer-tr482" Apr 23 16:35:32.247244 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:32.247206 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:32.247386 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:32.247265 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/047666b9-5e8b-4117-8317-ca917bf89757-original-pull-secret podName:047666b9-5e8b-4117-8317-ca917bf89757 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:48.247250722 +0000 UTC m=+34.308324953 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/047666b9-5e8b-4117-8317-ca917bf89757-original-pull-secret") pod "global-pull-secret-syncer-tr482" (UID: "047666b9-5e8b-4117-8317-ca917bf89757") : object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:32.247386 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:32.247207 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:35:32.247386 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:32.247322 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:35:32.247386 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:32.247336 2580 projected.go:194] Error preparing data for projected volume kube-api-access-pjms6 for pod openshift-network-diagnostics/network-check-target-kh9hh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:32.247386 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:32.247373 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c0ebdcb-a7e8-4e29-a486-52aed308cf33-kube-api-access-pjms6 podName:3c0ebdcb-a7e8-4e29-a486-52aed308cf33 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:48.247361886 +0000 UTC m=+34.308436131 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-pjms6" (UniqueName: "kubernetes.io/projected/3c0ebdcb-a7e8-4e29-a486-52aed308cf33-kube-api-access-pjms6") pod "network-check-target-kh9hh" (UID: "3c0ebdcb-a7e8-4e29-a486-52aed308cf33") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:32.532624 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:32.532589 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh9hh" Apr 23 16:35:32.532802 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:32.532627 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tr482" Apr 23 16:35:32.532802 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:32.532709 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kh9hh" podUID="3c0ebdcb-a7e8-4e29-a486-52aed308cf33" Apr 23 16:35:32.533181 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:32.532831 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tr482" podUID="047666b9-5e8b-4117-8317-ca917bf89757" Apr 23 16:35:33.532795 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:33.532759 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5ps7g" Apr 23 16:35:33.532968 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:33.532887 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5ps7g" podUID="4033b659-eaae-4ad3-a8a3-523bdf5fcf89" Apr 23 16:35:34.493994 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:34.493719 2580 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee57872d_b83c_49ea_b226_5322cb6d1db3.slice/crio-6ba2b0658fb0622fd4663cc892b1369b432c9a60b850139b49ab865fd40232d6.scope\": RecentStats: unable to find data in memory cache]" Apr 23 16:35:34.533626 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:34.533599 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh9hh" Apr 23 16:35:34.534261 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:34.533646 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tr482" Apr 23 16:35:34.534261 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:34.533709 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kh9hh" podUID="3c0ebdcb-a7e8-4e29-a486-52aed308cf33" Apr 23 16:35:34.534261 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:34.533755 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tr482" podUID="047666b9-5e8b-4117-8317-ca917bf89757" Apr 23 16:35:34.681816 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:34.681782 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w5pdt" event={"ID":"5da0035a-7e6e-4e50-9404-1dde996e4313","Type":"ContainerStarted","Data":"b793d522196c2246d987daab9a0d9a19d37029a72209d5274529cd76987c963a"} Apr 23 16:35:34.683656 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:34.683628 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xs2mg" event={"ID":"b70dcd33-861c-4a46-8752-421c750db1ff","Type":"ContainerStarted","Data":"20e1808b7c6aebd4094a0e81d0c888221af5ac7d6adf40f00891745f9ee27a01"} Apr 23 16:35:34.685213 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:34.685187 2580 generic.go:358] "Generic (PLEG): container finished" podID="ee57872d-b83c-49ea-b226-5322cb6d1db3" containerID="6ba2b0658fb0622fd4663cc892b1369b432c9a60b850139b49ab865fd40232d6" exitCode=0 Apr 23 16:35:34.685320 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:34.685272 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wqfht" event={"ID":"ee57872d-b83c-49ea-b226-5322cb6d1db3","Type":"ContainerDied","Data":"6ba2b0658fb0622fd4663cc892b1369b432c9a60b850139b49ab865fd40232d6"} Apr 23 16:35:34.689388 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:34.687983 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-node-tuning-operator/tuned-9qvrt" event={"ID":"17d9d4eb-d09b-4544-b4c2-6f2bd29c53a6","Type":"ContainerStarted","Data":"66e90d27a5ab6b4a1f092a2f8fc596c6f2685e50c9731066cb2c21f24e9afd2b"} Apr 23 16:35:34.692276 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:34.692216 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g68pm" event={"ID":"b153d070-3e87-4322-a47e-5cefb6aded60","Type":"ContainerStarted","Data":"c70a98ad456994b082f6632bcd7e44a56c869f6f7349e2fea47d857b23a405f9"} Apr 23 16:35:34.694093 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:34.694068 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-q8rd7" event={"ID":"268f3349-6678-43d5-8596-698c807f908a","Type":"ContainerStarted","Data":"bf50360d4e388ffbc84a1b5cd3206b676ce3eb9d1567c78ca8224716f6b4502f"} Apr 23 16:35:34.695722 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:34.695696 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-z94bz" event={"ID":"8e6b0d15-ef44-4cfa-b68b-1d1110a5e3b4","Type":"ContainerStarted","Data":"394262194c77c15447aff9935da98f2cc9e8b66914e171771f52a65bb250f59c"} Apr 23 16:35:34.698243 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:34.698227 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfkqz_5949893b-cd3d-46d5-b194-4ef1ad542b81/ovn-acl-logging/0.log" Apr 23 16:35:34.698580 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:34.698562 2580 generic.go:358] "Generic (PLEG): container finished" podID="5949893b-cd3d-46d5-b194-4ef1ad542b81" containerID="908990aeab928e1ec985166c85350a6bbdc167452599195aaf92db1fff96754b" exitCode=1 Apr 23 16:35:34.698678 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:34.698605 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" 
event={"ID":"5949893b-cd3d-46d5-b194-4ef1ad542b81","Type":"ContainerStarted","Data":"e196ff365587fa2c25bc682ae67959ba597ca7f5b0683369004fe942eadc7d66"} Apr 23 16:35:34.698678 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:34.698625 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" event={"ID":"5949893b-cd3d-46d5-b194-4ef1ad542b81","Type":"ContainerStarted","Data":"aa88f37d46f50a858e59864e4d26f236bc04debc2878f1c5caddeb62615710e5"} Apr 23 16:35:34.698678 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:34.698640 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" event={"ID":"5949893b-cd3d-46d5-b194-4ef1ad542b81","Type":"ContainerStarted","Data":"36020bba0f955cba5d6aad1db476af6c2eed23cddb61d2e0f0cadad7fb7c299e"} Apr 23 16:35:34.698678 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:34.698671 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" event={"ID":"5949893b-cd3d-46d5-b194-4ef1ad542b81","Type":"ContainerDied","Data":"908990aeab928e1ec985166c85350a6bbdc167452599195aaf92db1fff96754b"} Apr 23 16:35:34.698825 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:34.698688 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" event={"ID":"5949893b-cd3d-46d5-b194-4ef1ad542b81","Type":"ContainerStarted","Data":"e140459db1d13ddd09cb56ec00a816cc1d34ee85cc91a298a75dfedfd5f31c1e"} Apr 23 16:35:34.704323 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:34.704264 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-w5pdt" podStartSLOduration=3.877720377 podStartE2EDuration="20.704249872s" podCreationTimestamp="2026-04-23 16:35:14 +0000 UTC" firstStartedPulling="2026-04-23 16:35:17.094185273 +0000 UTC m=+3.155259506" lastFinishedPulling="2026-04-23 16:35:33.920714765 +0000 UTC m=+19.981789001" 
observedRunningTime="2026-04-23 16:35:34.703734725 +0000 UTC m=+20.764808982" watchObservedRunningTime="2026-04-23 16:35:34.704249872 +0000 UTC m=+20.765324127" Apr 23 16:35:34.724594 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:34.724451 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xs2mg" podStartSLOduration=3.856141312 podStartE2EDuration="20.72443883s" podCreationTimestamp="2026-04-23 16:35:14 +0000 UTC" firstStartedPulling="2026-04-23 16:35:17.092514809 +0000 UTC m=+3.153589050" lastFinishedPulling="2026-04-23 16:35:33.960812335 +0000 UTC m=+20.021886568" observedRunningTime="2026-04-23 16:35:34.724233002 +0000 UTC m=+20.785307269" watchObservedRunningTime="2026-04-23 16:35:34.72443883 +0000 UTC m=+20.785513086" Apr 23 16:35:34.794630 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:34.794572 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-q8rd7" podStartSLOduration=3.986349446 podStartE2EDuration="20.794556526s" podCreationTimestamp="2026-04-23 16:35:14 +0000 UTC" firstStartedPulling="2026-04-23 16:35:17.103731724 +0000 UTC m=+3.164805967" lastFinishedPulling="2026-04-23 16:35:33.911938801 +0000 UTC m=+19.973013047" observedRunningTime="2026-04-23 16:35:34.775846308 +0000 UTC m=+20.836920563" watchObservedRunningTime="2026-04-23 16:35:34.794556526 +0000 UTC m=+20.855630779" Apr 23 16:35:34.794809 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:34.794659 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-9qvrt" podStartSLOduration=3.975654125 podStartE2EDuration="20.79465429s" podCreationTimestamp="2026-04-23 16:35:14 +0000 UTC" firstStartedPulling="2026-04-23 16:35:17.100052684 +0000 UTC m=+3.161126916" lastFinishedPulling="2026-04-23 16:35:33.919052848 +0000 UTC m=+19.980127081" observedRunningTime="2026-04-23 16:35:34.793882373 +0000 UTC m=+20.854956628" 
watchObservedRunningTime="2026-04-23 16:35:34.79465429 +0000 UTC m=+20.855728545" Apr 23 16:35:35.532445 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:35.532412 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5ps7g" Apr 23 16:35:35.532577 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:35.532551 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5ps7g" podUID="4033b659-eaae-4ad3-a8a3-523bdf5fcf89" Apr 23 16:35:35.549351 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:35.549327 2580 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 16:35:35.702125 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:35.701859 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g68pm" event={"ID":"b153d070-3e87-4322-a47e-5cefb6aded60","Type":"ContainerStarted","Data":"168218e4e2a9869e32fb52c96c604c80c70044eeb8111dcc4caa820e97848cfc"} Apr 23 16:35:35.703174 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:35.703142 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rflcs" event={"ID":"398390dc-662b-42ee-b57e-22175922f0ac","Type":"ContainerStarted","Data":"80662989ca404069ca9042dd7519dccc3ac25bd6d0783712bb1babec43bc433d"} Apr 23 16:35:35.705848 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:35.705818 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfkqz_5949893b-cd3d-46d5-b194-4ef1ad542b81/ovn-acl-logging/0.log" Apr 23 16:35:35.706271 
ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:35.706243 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" event={"ID":"5949893b-cd3d-46d5-b194-4ef1ad542b81","Type":"ContainerStarted","Data":"9e5ac65a3375b33a20166e8104f15957054597075614e4dd85bafe0579b71879"} Apr 23 16:35:35.731427 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:35.731384 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-rflcs" podStartSLOduration=4.91627709 podStartE2EDuration="21.731366558s" podCreationTimestamp="2026-04-23 16:35:14 +0000 UTC" firstStartedPulling="2026-04-23 16:35:17.100919999 +0000 UTC m=+3.161994242" lastFinishedPulling="2026-04-23 16:35:33.916009477 +0000 UTC m=+19.977083710" observedRunningTime="2026-04-23 16:35:35.730047684 +0000 UTC m=+21.791121938" watchObservedRunningTime="2026-04-23 16:35:35.731366558 +0000 UTC m=+21.792440813" Apr 23 16:35:35.731619 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:35.731595 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-z94bz" podStartSLOduration=4.918675353 podStartE2EDuration="21.731579453s" podCreationTimestamp="2026-04-23 16:35:14 +0000 UTC" firstStartedPulling="2026-04-23 16:35:17.103028893 +0000 UTC m=+3.164103132" lastFinishedPulling="2026-04-23 16:35:33.915932996 +0000 UTC m=+19.977007232" observedRunningTime="2026-04-23 16:35:34.812613431 +0000 UTC m=+20.873687682" watchObservedRunningTime="2026-04-23 16:35:35.731579453 +0000 UTC m=+21.792653712" Apr 23 16:35:36.475995 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:36.475898 2580 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T16:35:35.549344777Z","UUID":"cf1b81b6-7032-473d-a71a-39c670b76cb4","Handler":null,"Name":"","Endpoint":""} Apr 23 16:35:36.477704 
ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:36.477682 2580 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 16:35:36.477704 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:36.477712 2580 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 16:35:36.532970 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:36.532933 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tr482" Apr 23 16:35:36.533133 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:36.533055 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tr482" podUID="047666b9-5e8b-4117-8317-ca917bf89757" Apr 23 16:35:36.533133 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:36.533063 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh9hh" Apr 23 16:35:36.533233 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:36.533147 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kh9hh" podUID="3c0ebdcb-a7e8-4e29-a486-52aed308cf33" Apr 23 16:35:37.533123 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:37.533087 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5ps7g" Apr 23 16:35:37.533631 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:37.533227 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5ps7g" podUID="4033b659-eaae-4ad3-a8a3-523bdf5fcf89" Apr 23 16:35:37.712941 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:37.712901 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g68pm" event={"ID":"b153d070-3e87-4322-a47e-5cefb6aded60","Type":"ContainerStarted","Data":"206a37bb6577b9f3b8574f29e99237f1975d2f8905af464f177d97023585ed86"} Apr 23 16:35:37.715965 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:37.715944 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfkqz_5949893b-cd3d-46d5-b194-4ef1ad542b81/ovn-acl-logging/0.log" Apr 23 16:35:37.716334 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:37.716308 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" event={"ID":"5949893b-cd3d-46d5-b194-4ef1ad542b81","Type":"ContainerStarted","Data":"3933c3fb98192cda171d6f1f41c7d0ab0836d681a626dd6b44947bf710158af3"} Apr 23 16:35:37.732094 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:37.732047 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g68pm" podStartSLOduration=4.053568464 
podStartE2EDuration="23.732034544s" podCreationTimestamp="2026-04-23 16:35:14 +0000 UTC" firstStartedPulling="2026-04-23 16:35:17.096696975 +0000 UTC m=+3.157771210" lastFinishedPulling="2026-04-23 16:35:36.775163055 +0000 UTC m=+22.836237290" observedRunningTime="2026-04-23 16:35:37.731963659 +0000 UTC m=+23.793037938" watchObservedRunningTime="2026-04-23 16:35:37.732034544 +0000 UTC m=+23.793108777" Apr 23 16:35:38.536021 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:38.535993 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh9hh" Apr 23 16:35:38.536612 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:38.536000 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tr482" Apr 23 16:35:38.536612 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:38.536097 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kh9hh" podUID="3c0ebdcb-a7e8-4e29-a486-52aed308cf33" Apr 23 16:35:38.536612 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:38.536199 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-tr482" podUID="047666b9-5e8b-4117-8317-ca917bf89757" Apr 23 16:35:39.259177 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:39.259010 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-q8rd7" Apr 23 16:35:39.259647 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:39.259629 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-q8rd7" Apr 23 16:35:39.532413 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:39.532338 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5ps7g" Apr 23 16:35:39.532531 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:39.532453 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5ps7g" podUID="4033b659-eaae-4ad3-a8a3-523bdf5fcf89" Apr 23 16:35:39.721813 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:39.721780 2580 generic.go:358] "Generic (PLEG): container finished" podID="ee57872d-b83c-49ea-b226-5322cb6d1db3" containerID="3cb4a2843db30b1763c9136bb218a86fa56d699be2dd26cd20ff39fef4d418c8" exitCode=0 Apr 23 16:35:39.722259 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:39.721866 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wqfht" event={"ID":"ee57872d-b83c-49ea-b226-5322cb6d1db3","Type":"ContainerDied","Data":"3cb4a2843db30b1763c9136bb218a86fa56d699be2dd26cd20ff39fef4d418c8"} Apr 23 16:35:39.724827 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:39.724810 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfkqz_5949893b-cd3d-46d5-b194-4ef1ad542b81/ovn-acl-logging/0.log" Apr 23 16:35:39.725186 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:39.725159 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" event={"ID":"5949893b-cd3d-46d5-b194-4ef1ad542b81","Type":"ContainerStarted","Data":"2a121337758760a35363edad97011426273b192b29f6f0e0168686985935afcd"} Apr 23 16:35:39.725436 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:39.725407 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-q8rd7" Apr 23 16:35:39.725613 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:39.725596 2580 scope.go:117] "RemoveContainer" containerID="908990aeab928e1ec985166c85350a6bbdc167452599195aaf92db1fff96754b" Apr 23 16:35:39.725853 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:39.725829 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-q8rd7" Apr 23 16:35:40.535101 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:40.535079 2580 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tr482" Apr 23 16:35:40.535206 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:40.535080 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh9hh" Apr 23 16:35:40.535206 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:40.535174 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tr482" podUID="047666b9-5e8b-4117-8317-ca917bf89757" Apr 23 16:35:40.535341 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:40.535268 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kh9hh" podUID="3c0ebdcb-a7e8-4e29-a486-52aed308cf33" Apr 23 16:35:40.731124 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:40.731101 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfkqz_5949893b-cd3d-46d5-b194-4ef1ad542b81/ovn-acl-logging/0.log" Apr 23 16:35:40.731533 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:40.731459 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" event={"ID":"5949893b-cd3d-46d5-b194-4ef1ad542b81","Type":"ContainerStarted","Data":"6deb72031a4493fb94d215435f1a68bb23e795fb094887b32c1a5de2f0c08f90"} Apr 23 16:35:40.731847 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:40.731825 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:40.731847 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:40.731853 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:40.732011 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:40.731866 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:40.733565 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:40.733539 2580 generic.go:358] "Generic (PLEG): container finished" podID="ee57872d-b83c-49ea-b226-5322cb6d1db3" containerID="72c026762f8ab23c93ef30e25f1eb0e39f945cf39ada7f2767088f95678222ad" exitCode=0 Apr 23 16:35:40.733657 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:40.733604 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wqfht" event={"ID":"ee57872d-b83c-49ea-b226-5322cb6d1db3","Type":"ContainerDied","Data":"72c026762f8ab23c93ef30e25f1eb0e39f945cf39ada7f2767088f95678222ad"} Apr 23 16:35:40.748136 ip-10-0-128-198 
kubenswrapper[2580]: I0423 16:35:40.748104 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:40.748995 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:40.748975 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" Apr 23 16:35:40.783591 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:40.783545 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz" podStartSLOduration=9.898944647 podStartE2EDuration="26.78353252s" podCreationTimestamp="2026-04-23 16:35:14 +0000 UTC" firstStartedPulling="2026-04-23 16:35:17.096786658 +0000 UTC m=+3.157860890" lastFinishedPulling="2026-04-23 16:35:33.98137452 +0000 UTC m=+20.042448763" observedRunningTime="2026-04-23 16:35:40.783279712 +0000 UTC m=+26.844353967" watchObservedRunningTime="2026-04-23 16:35:40.78353252 +0000 UTC m=+26.844606773" Apr 23 16:35:40.949994 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:40.949961 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-kh9hh"] Apr 23 16:35:40.950155 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:40.950078 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh9hh" Apr 23 16:35:40.950198 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:40.950183 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kh9hh" podUID="3c0ebdcb-a7e8-4e29-a486-52aed308cf33" Apr 23 16:35:40.956165 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:40.956145 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5ps7g"] Apr 23 16:35:40.956252 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:40.956242 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5ps7g" Apr 23 16:35:40.956354 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:40.956332 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5ps7g" podUID="4033b659-eaae-4ad3-a8a3-523bdf5fcf89" Apr 23 16:35:40.958068 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:40.958035 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-tr482"] Apr 23 16:35:40.958153 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:40.958141 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tr482" Apr 23 16:35:40.958281 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:40.958262 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-tr482" podUID="047666b9-5e8b-4117-8317-ca917bf89757" Apr 23 16:35:41.737304 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:41.737089 2580 generic.go:358] "Generic (PLEG): container finished" podID="ee57872d-b83c-49ea-b226-5322cb6d1db3" containerID="10be1092ed56b3eb51e814d23a8a59e3a1b96d27c39e95e272f489dd8a3bd3c8" exitCode=0 Apr 23 16:35:41.737872 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:41.737166 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wqfht" event={"ID":"ee57872d-b83c-49ea-b226-5322cb6d1db3","Type":"ContainerDied","Data":"10be1092ed56b3eb51e814d23a8a59e3a1b96d27c39e95e272f489dd8a3bd3c8"} Apr 23 16:35:42.539000 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:42.538968 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh9hh" Apr 23 16:35:42.539000 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:42.538996 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5ps7g" Apr 23 16:35:42.539000 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:42.539004 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tr482" Apr 23 16:35:42.539310 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:42.539083 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kh9hh" podUID="3c0ebdcb-a7e8-4e29-a486-52aed308cf33" Apr 23 16:35:42.539373 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:42.539306 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5ps7g" podUID="4033b659-eaae-4ad3-a8a3-523bdf5fcf89" Apr 23 16:35:42.539432 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:42.539378 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tr482" podUID="047666b9-5e8b-4117-8317-ca917bf89757" Apr 23 16:35:44.535996 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:44.535932 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5ps7g" Apr 23 16:35:44.535996 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:44.535962 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh9hh" Apr 23 16:35:44.535996 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:44.535976 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-tr482" Apr 23 16:35:44.536652 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:44.536066 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5ps7g" podUID="4033b659-eaae-4ad3-a8a3-523bdf5fcf89" Apr 23 16:35:44.536652 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:44.536189 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kh9hh" podUID="3c0ebdcb-a7e8-4e29-a486-52aed308cf33" Apr 23 16:35:44.536652 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:44.536263 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tr482" podUID="047666b9-5e8b-4117-8317-ca917bf89757" Apr 23 16:35:46.533634 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:46.533322 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5ps7g" Apr 23 16:35:46.533634 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:46.533393 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-tr482" Apr 23 16:35:46.533634 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:46.533503 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tr482" podUID="047666b9-5e8b-4117-8317-ca917bf89757" Apr 23 16:35:46.533634 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:46.533593 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh9hh" Apr 23 16:35:46.534381 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:46.533733 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kh9hh" podUID="3c0ebdcb-a7e8-4e29-a486-52aed308cf33" Apr 23 16:35:46.534381 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:46.533844 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5ps7g" podUID="4033b659-eaae-4ad3-a8a3-523bdf5fcf89" Apr 23 16:35:47.289440 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.289409 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-198.ec2.internal" event="NodeReady" Apr 23 16:35:47.289592 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.289537 2580 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 16:35:47.341570 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.341540 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-dbbdff94c-xrmkj"] Apr 23 16:35:47.355729 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.355706 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:35:47.365963 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.365941 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zbx8c"] Apr 23 16:35:47.368705 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.368687 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 23 16:35:47.368820 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.368711 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-x7fs6\"" Apr 23 16:35:47.368820 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.368687 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 23 16:35:47.368820 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.368763 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 23 16:35:47.376921 ip-10-0-128-198 
kubenswrapper[2580]: I0423 16:35:47.376903 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zbx8c" Apr 23 16:35:47.380606 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.380587 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-dbbdff94c-xrmkj"] Apr 23 16:35:47.381173 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.381150 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 16:35:47.381245 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.381186 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 16:35:47.381245 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.381198 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 16:35:47.381649 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.381628 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l9v47\"" Apr 23 16:35:47.387674 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.387656 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 23 16:35:47.388926 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.388905 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zbx8c"] Apr 23 16:35:47.458118 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.458091 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4vzw2"] Apr 23 16:35:47.466687 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.466663 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-4vzw2" Apr 23 16:35:47.472076 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.472047 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4vzw2"] Apr 23 16:35:47.472214 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.472114 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 16:35:47.472925 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.472323 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vvvrx\"" Apr 23 16:35:47.472925 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.472422 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 16:35:47.472925 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.472510 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a538e20-bab6-46fe-9017-54d1d693ba8c-trusted-ca\") pod \"image-registry-dbbdff94c-xrmkj\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:35:47.472925 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.472566 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsk6d\" (UniqueName: \"kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-kube-api-access-rsk6d\") pod \"image-registry-dbbdff94c-xrmkj\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:35:47.472925 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.472593 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/e8a8577c-7241-452c-bed6-2d35076dce94-cert\") pod \"ingress-canary-zbx8c\" (UID: \"e8a8577c-7241-452c-bed6-2d35076dce94\") " pod="openshift-ingress-canary/ingress-canary-zbx8c" Apr 23 16:35:47.472925 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.472625 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-certificates\") pod \"image-registry-dbbdff94c-xrmkj\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:35:47.472925 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.472676 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-tls\") pod \"image-registry-dbbdff94c-xrmkj\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:35:47.472925 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.472730 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a538e20-bab6-46fe-9017-54d1d693ba8c-installation-pull-secrets\") pod \"image-registry-dbbdff94c-xrmkj\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:35:47.472925 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.472765 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0a538e20-bab6-46fe-9017-54d1d693ba8c-ca-trust-extracted\") pod \"image-registry-dbbdff94c-xrmkj\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " 
pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:35:47.472925 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.472785 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-bound-sa-token\") pod \"image-registry-dbbdff94c-xrmkj\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:35:47.472925 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.472890 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0a538e20-bab6-46fe-9017-54d1d693ba8c-image-registry-private-configuration\") pod \"image-registry-dbbdff94c-xrmkj\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:35:47.472925 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.472910 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb9l4\" (UniqueName: \"kubernetes.io/projected/e8a8577c-7241-452c-bed6-2d35076dce94-kube-api-access-kb9l4\") pod \"ingress-canary-zbx8c\" (UID: \"e8a8577c-7241-452c-bed6-2d35076dce94\") " pod="openshift-ingress-canary/ingress-canary-zbx8c" Apr 23 16:35:47.573901 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.573877 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-config-volume\") pod \"dns-default-4vzw2\" (UID: \"0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4\") " pod="openshift-dns/dns-default-4vzw2" Apr 23 16:35:47.574360 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.573916 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a538e20-bab6-46fe-9017-54d1d693ba8c-installation-pull-secrets\") pod \"image-registry-dbbdff94c-xrmkj\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:35:47.574360 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.573939 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0a538e20-bab6-46fe-9017-54d1d693ba8c-ca-trust-extracted\") pod \"image-registry-dbbdff94c-xrmkj\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:35:47.574360 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.574016 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-bound-sa-token\") pod \"image-registry-dbbdff94c-xrmkj\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:35:47.574360 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.574083 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0a538e20-bab6-46fe-9017-54d1d693ba8c-image-registry-private-configuration\") pod \"image-registry-dbbdff94c-xrmkj\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:35:47.574360 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.574111 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kb9l4\" (UniqueName: \"kubernetes.io/projected/e8a8577c-7241-452c-bed6-2d35076dce94-kube-api-access-kb9l4\") pod \"ingress-canary-zbx8c\" (UID: \"e8a8577c-7241-452c-bed6-2d35076dce94\") " 
pod="openshift-ingress-canary/ingress-canary-zbx8c" Apr 23 16:35:47.574360 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.574144 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a538e20-bab6-46fe-9017-54d1d693ba8c-trusted-ca\") pod \"image-registry-dbbdff94c-xrmkj\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:35:47.574360 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.574169 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99ppl\" (UniqueName: \"kubernetes.io/projected/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-kube-api-access-99ppl\") pod \"dns-default-4vzw2\" (UID: \"0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4\") " pod="openshift-dns/dns-default-4vzw2" Apr 23 16:35:47.574360 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.574197 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-metrics-tls\") pod \"dns-default-4vzw2\" (UID: \"0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4\") " pod="openshift-dns/dns-default-4vzw2" Apr 23 16:35:47.574680 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.574539 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rsk6d\" (UniqueName: \"kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-kube-api-access-rsk6d\") pod \"image-registry-dbbdff94c-xrmkj\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:35:47.574733 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.574715 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8a8577c-7241-452c-bed6-2d35076dce94-cert\") pod 
\"ingress-canary-zbx8c\" (UID: \"e8a8577c-7241-452c-bed6-2d35076dce94\") " pod="openshift-ingress-canary/ingress-canary-zbx8c" Apr 23 16:35:47.574797 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.574780 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-certificates\") pod \"image-registry-dbbdff94c-xrmkj\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:35:47.574850 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.574834 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-tmp-dir\") pod \"dns-default-4vzw2\" (UID: \"0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4\") " pod="openshift-dns/dns-default-4vzw2" Apr 23 16:35:47.574944 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.574929 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-tls\") pod \"image-registry-dbbdff94c-xrmkj\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:35:47.575078 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:47.575066 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 16:35:47.575112 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:47.575082 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-dbbdff94c-xrmkj: secret "image-registry-tls" not found Apr 23 16:35:47.575151 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:47.575142 2580 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-tls podName:0a538e20-bab6-46fe-9017-54d1d693ba8c nodeName:}" failed. No retries permitted until 2026-04-23 16:35:48.075125036 +0000 UTC m=+34.136199283 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-tls") pod "image-registry-dbbdff94c-xrmkj" (UID: "0a538e20-bab6-46fe-9017-54d1d693ba8c") : secret "image-registry-tls" not found Apr 23 16:35:47.576731 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.576145 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a538e20-bab6-46fe-9017-54d1d693ba8c-trusted-ca\") pod \"image-registry-dbbdff94c-xrmkj\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:35:47.576731 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.574230 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0a538e20-bab6-46fe-9017-54d1d693ba8c-ca-trust-extracted\") pod \"image-registry-dbbdff94c-xrmkj\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:35:47.576731 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.576705 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-certificates\") pod \"image-registry-dbbdff94c-xrmkj\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:35:47.576940 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:47.576820 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret 
"canary-serving-cert" not found Apr 23 16:35:47.576940 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:47.576878 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8a8577c-7241-452c-bed6-2d35076dce94-cert podName:e8a8577c-7241-452c-bed6-2d35076dce94 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:48.076855229 +0000 UTC m=+34.137929477 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e8a8577c-7241-452c-bed6-2d35076dce94-cert") pod "ingress-canary-zbx8c" (UID: "e8a8577c-7241-452c-bed6-2d35076dce94") : secret "canary-serving-cert" not found Apr 23 16:35:47.580479 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.580361 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0a538e20-bab6-46fe-9017-54d1d693ba8c-image-registry-private-configuration\") pod \"image-registry-dbbdff94c-xrmkj\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:35:47.580573 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.580382 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a538e20-bab6-46fe-9017-54d1d693ba8c-installation-pull-secrets\") pod \"image-registry-dbbdff94c-xrmkj\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:35:47.585528 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.585507 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-bound-sa-token\") pod \"image-registry-dbbdff94c-xrmkj\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 
16:35:47.586808 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.586790 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb9l4\" (UniqueName: \"kubernetes.io/projected/e8a8577c-7241-452c-bed6-2d35076dce94-kube-api-access-kb9l4\") pod \"ingress-canary-zbx8c\" (UID: \"e8a8577c-7241-452c-bed6-2d35076dce94\") " pod="openshift-ingress-canary/ingress-canary-zbx8c" Apr 23 16:35:47.587105 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.587085 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsk6d\" (UniqueName: \"kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-kube-api-access-rsk6d\") pod \"image-registry-dbbdff94c-xrmkj\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:35:47.675309 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.675262 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99ppl\" (UniqueName: \"kubernetes.io/projected/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-kube-api-access-99ppl\") pod \"dns-default-4vzw2\" (UID: \"0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4\") " pod="openshift-dns/dns-default-4vzw2" Apr 23 16:35:47.675433 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.675322 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-metrics-tls\") pod \"dns-default-4vzw2\" (UID: \"0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4\") " pod="openshift-dns/dns-default-4vzw2" Apr 23 16:35:47.675433 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.675363 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-tmp-dir\") pod \"dns-default-4vzw2\" (UID: \"0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4\") " pod="openshift-dns/dns-default-4vzw2" 
Apr 23 16:35:47.675433 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.675401 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-config-volume\") pod \"dns-default-4vzw2\" (UID: \"0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4\") " pod="openshift-dns/dns-default-4vzw2" Apr 23 16:35:47.675585 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:47.675446 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 16:35:47.675585 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:47.675516 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-metrics-tls podName:0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:48.17549711 +0000 UTC m=+34.236571347 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-metrics-tls") pod "dns-default-4vzw2" (UID: "0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4") : secret "dns-default-metrics-tls" not found Apr 23 16:35:47.675775 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.675754 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-tmp-dir\") pod \"dns-default-4vzw2\" (UID: \"0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4\") " pod="openshift-dns/dns-default-4vzw2" Apr 23 16:35:47.675884 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.675869 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-config-volume\") pod \"dns-default-4vzw2\" (UID: \"0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4\") " pod="openshift-dns/dns-default-4vzw2" Apr 23 
16:35:47.685061 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.685038 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-99ppl\" (UniqueName: \"kubernetes.io/projected/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-kube-api-access-99ppl\") pod \"dns-default-4vzw2\" (UID: \"0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4\") " pod="openshift-dns/dns-default-4vzw2" Apr 23 16:35:47.752532 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:47.752505 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wqfht" event={"ID":"ee57872d-b83c-49ea-b226-5322cb6d1db3","Type":"ContainerStarted","Data":"f9e4da804d301601ec024958e0b5fe49c722e4cd8bfe7873d9168d13517396d3"} Apr 23 16:35:48.078079 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:48.078045 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-tls\") pod \"image-registry-dbbdff94c-xrmkj\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:35:48.078249 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:48.078120 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8a8577c-7241-452c-bed6-2d35076dce94-cert\") pod \"ingress-canary-zbx8c\" (UID: \"e8a8577c-7241-452c-bed6-2d35076dce94\") " pod="openshift-ingress-canary/ingress-canary-zbx8c" Apr 23 16:35:48.078249 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:48.078186 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 16:35:48.078249 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:48.078206 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-dbbdff94c-xrmkj: secret 
"image-registry-tls" not found Apr 23 16:35:48.078249 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:48.078207 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 16:35:48.078407 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:48.078257 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8a8577c-7241-452c-bed6-2d35076dce94-cert podName:e8a8577c-7241-452c-bed6-2d35076dce94 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:49.078243917 +0000 UTC m=+35.139318148 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e8a8577c-7241-452c-bed6-2d35076dce94-cert") pod "ingress-canary-zbx8c" (UID: "e8a8577c-7241-452c-bed6-2d35076dce94") : secret "canary-serving-cert" not found Apr 23 16:35:48.078407 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:48.078271 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-tls podName:0a538e20-bab6-46fe-9017-54d1d693ba8c nodeName:}" failed. No retries permitted until 2026-04-23 16:35:49.078265349 +0000 UTC m=+35.139339582 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-tls") pod "image-registry-dbbdff94c-xrmkj" (UID: "0a538e20-bab6-46fe-9017-54d1d693ba8c") : secret "image-registry-tls" not found Apr 23 16:35:48.179385 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:48.179353 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-metrics-tls\") pod \"dns-default-4vzw2\" (UID: \"0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4\") " pod="openshift-dns/dns-default-4vzw2" Apr 23 16:35:48.179567 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:48.179413 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-metrics-certs\") pod \"network-metrics-daemon-5ps7g\" (UID: \"4033b659-eaae-4ad3-a8a3-523bdf5fcf89\") " pod="openshift-multus/network-metrics-daemon-5ps7g" Apr 23 16:35:48.179567 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:48.179523 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:48.179567 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:48.179527 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 16:35:48.179659 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:48.179577 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-metrics-tls podName:0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:49.17956324 +0000 UTC m=+35.240637471 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-metrics-tls") pod "dns-default-4vzw2" (UID: "0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4") : secret "dns-default-metrics-tls" not found Apr 23 16:35:48.179659 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:48.179589 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-metrics-certs podName:4033b659-eaae-4ad3-a8a3-523bdf5fcf89 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:20.179584048 +0000 UTC m=+66.240658279 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-metrics-certs") pod "network-metrics-daemon-5ps7g" (UID: "4033b659-eaae-4ad3-a8a3-523bdf5fcf89") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:48.279981 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:48.279949 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pjms6\" (UniqueName: \"kubernetes.io/projected/3c0ebdcb-a7e8-4e29-a486-52aed308cf33-kube-api-access-pjms6\") pod \"network-check-target-kh9hh\" (UID: \"3c0ebdcb-a7e8-4e29-a486-52aed308cf33\") " pod="openshift-network-diagnostics/network-check-target-kh9hh" Apr 23 16:35:48.280085 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:48.280020 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/047666b9-5e8b-4117-8317-ca917bf89757-original-pull-secret\") pod \"global-pull-secret-syncer-tr482\" (UID: \"047666b9-5e8b-4117-8317-ca917bf89757\") " pod="kube-system/global-pull-secret-syncer-tr482" Apr 23 16:35:48.280140 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:48.280095 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object 
"kube-system"/"original-pull-secret" not registered Apr 23 16:35:48.280140 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:48.280126 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:35:48.280232 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:48.280156 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:35:48.280232 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:48.280140 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/047666b9-5e8b-4117-8317-ca917bf89757-original-pull-secret podName:047666b9-5e8b-4117-8317-ca917bf89757 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:20.280127648 +0000 UTC m=+66.341201885 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/047666b9-5e8b-4117-8317-ca917bf89757-original-pull-secret") pod "global-pull-secret-syncer-tr482" (UID: "047666b9-5e8b-4117-8317-ca917bf89757") : object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:48.280232 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:48.280170 2580 projected.go:194] Error preparing data for projected volume kube-api-access-pjms6 for pod openshift-network-diagnostics/network-check-target-kh9hh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:48.280232 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:48.280227 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c0ebdcb-a7e8-4e29-a486-52aed308cf33-kube-api-access-pjms6 podName:3c0ebdcb-a7e8-4e29-a486-52aed308cf33 nodeName:}" failed. 
No retries permitted until 2026-04-23 16:36:20.280208799 +0000 UTC m=+66.341283043 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-pjms6" (UniqueName: "kubernetes.io/projected/3c0ebdcb-a7e8-4e29-a486-52aed308cf33-kube-api-access-pjms6") pod "network-check-target-kh9hh" (UID: "3c0ebdcb-a7e8-4e29-a486-52aed308cf33") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:48.532953 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:48.532918 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh9hh" Apr 23 16:35:48.533140 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:48.532971 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5ps7g" Apr 23 16:35:48.533140 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:48.533055 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-tr482" Apr 23 16:35:48.540035 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:48.539120 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-nldpm\"" Apr 23 16:35:48.540035 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:48.539138 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 16:35:48.540035 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:48.539169 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 16:35:48.540035 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:48.539387 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tpj7t\"" Apr 23 16:35:48.540035 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:48.539424 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 16:35:48.540035 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:48.539475 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 16:35:48.756323 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:48.756270 2580 generic.go:358] "Generic (PLEG): container finished" podID="ee57872d-b83c-49ea-b226-5322cb6d1db3" containerID="f9e4da804d301601ec024958e0b5fe49c722e4cd8bfe7873d9168d13517396d3" exitCode=0 Apr 23 16:35:48.756676 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:48.756337 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wqfht" event={"ID":"ee57872d-b83c-49ea-b226-5322cb6d1db3","Type":"ContainerDied","Data":"f9e4da804d301601ec024958e0b5fe49c722e4cd8bfe7873d9168d13517396d3"} Apr 
23 16:35:49.086015 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:49.085922 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8a8577c-7241-452c-bed6-2d35076dce94-cert\") pod \"ingress-canary-zbx8c\" (UID: \"e8a8577c-7241-452c-bed6-2d35076dce94\") " pod="openshift-ingress-canary/ingress-canary-zbx8c" Apr 23 16:35:49.086015 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:49.085985 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-tls\") pod \"image-registry-dbbdff94c-xrmkj\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:35:49.086197 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:49.086074 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 16:35:49.086197 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:49.086102 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-dbbdff94c-xrmkj: secret "image-registry-tls" not found Apr 23 16:35:49.086197 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:49.086151 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-tls podName:0a538e20-bab6-46fe-9017-54d1d693ba8c nodeName:}" failed. No retries permitted until 2026-04-23 16:35:51.086136543 +0000 UTC m=+37.147210775 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-tls") pod "image-registry-dbbdff94c-xrmkj" (UID: "0a538e20-bab6-46fe-9017-54d1d693ba8c") : secret "image-registry-tls" not found Apr 23 16:35:49.086197 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:49.086072 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 16:35:49.086348 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:49.086218 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8a8577c-7241-452c-bed6-2d35076dce94-cert podName:e8a8577c-7241-452c-bed6-2d35076dce94 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:51.086202946 +0000 UTC m=+37.147277180 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e8a8577c-7241-452c-bed6-2d35076dce94-cert") pod "ingress-canary-zbx8c" (UID: "e8a8577c-7241-452c-bed6-2d35076dce94") : secret "canary-serving-cert" not found Apr 23 16:35:49.186985 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:49.186951 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-metrics-tls\") pod \"dns-default-4vzw2\" (UID: \"0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4\") " pod="openshift-dns/dns-default-4vzw2" Apr 23 16:35:49.187193 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:49.187094 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 16:35:49.187193 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:49.187153 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-metrics-tls podName:0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4 nodeName:}" failed. 
No retries permitted until 2026-04-23 16:35:51.187138643 +0000 UTC m=+37.248212879 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-metrics-tls") pod "dns-default-4vzw2" (UID: "0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4") : secret "dns-default-metrics-tls" not found Apr 23 16:35:49.760733 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:49.760690 2580 generic.go:358] "Generic (PLEG): container finished" podID="ee57872d-b83c-49ea-b226-5322cb6d1db3" containerID="fbd7d1eda5cb73309647b223fe9f81c46153a705c57427fc4c52a35dbde7477d" exitCode=0 Apr 23 16:35:49.761112 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:49.760739 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wqfht" event={"ID":"ee57872d-b83c-49ea-b226-5322cb6d1db3","Type":"ContainerDied","Data":"fbd7d1eda5cb73309647b223fe9f81c46153a705c57427fc4c52a35dbde7477d"} Apr 23 16:35:50.618350 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:50.618317 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-685dd846cd-c4xqt"] Apr 23 16:35:50.630039 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:50.630014 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-685dd846cd-c4xqt" Apr 23 16:35:50.631115 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:50.631091 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-685dd846cd-c4xqt"] Apr 23 16:35:50.632727 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:50.632710 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 23 16:35:50.634079 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:50.634058 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 23 16:35:50.634079 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:50.634075 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 23 16:35:50.634237 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:50.634082 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 23 16:35:50.765349 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:50.765250 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wqfht" event={"ID":"ee57872d-b83c-49ea-b226-5322cb6d1db3","Type":"ContainerStarted","Data":"3cc0e40d351947179ab3b661960ddbfd56f62aca0f3758717bc354b9ffd27873"} Apr 23 16:35:50.799944 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:50.799904 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gddws\" (UniqueName: \"kubernetes.io/projected/adc54c5e-4c8c-4c66-acc3-d8e38c5c074f-kube-api-access-gddws\") pod \"klusterlet-addon-workmgr-685dd846cd-c4xqt\" (UID: 
\"adc54c5e-4c8c-4c66-acc3-d8e38c5c074f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-685dd846cd-c4xqt" Apr 23 16:35:50.800098 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:50.800034 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/adc54c5e-4c8c-4c66-acc3-d8e38c5c074f-klusterlet-config\") pod \"klusterlet-addon-workmgr-685dd846cd-c4xqt\" (UID: \"adc54c5e-4c8c-4c66-acc3-d8e38c5c074f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-685dd846cd-c4xqt" Apr 23 16:35:50.800098 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:50.800058 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/adc54c5e-4c8c-4c66-acc3-d8e38c5c074f-tmp\") pod \"klusterlet-addon-workmgr-685dd846cd-c4xqt\" (UID: \"adc54c5e-4c8c-4c66-acc3-d8e38c5c074f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-685dd846cd-c4xqt" Apr 23 16:35:50.815855 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:50.815810 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-wqfht" podStartSLOduration=6.416805678 podStartE2EDuration="36.815796352s" podCreationTimestamp="2026-04-23 16:35:14 +0000 UTC" firstStartedPulling="2026-04-23 16:35:17.102119299 +0000 UTC m=+3.163193533" lastFinishedPulling="2026-04-23 16:35:47.501109975 +0000 UTC m=+33.562184207" observedRunningTime="2026-04-23 16:35:50.810781244 +0000 UTC m=+36.871855492" watchObservedRunningTime="2026-04-23 16:35:50.815796352 +0000 UTC m=+36.876870605" Apr 23 16:35:50.901264 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:50.901226 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/adc54c5e-4c8c-4c66-acc3-d8e38c5c074f-klusterlet-config\") 
pod \"klusterlet-addon-workmgr-685dd846cd-c4xqt\" (UID: \"adc54c5e-4c8c-4c66-acc3-d8e38c5c074f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-685dd846cd-c4xqt" Apr 23 16:35:50.901264 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:50.901263 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/adc54c5e-4c8c-4c66-acc3-d8e38c5c074f-tmp\") pod \"klusterlet-addon-workmgr-685dd846cd-c4xqt\" (UID: \"adc54c5e-4c8c-4c66-acc3-d8e38c5c074f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-685dd846cd-c4xqt" Apr 23 16:35:50.901536 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:50.901333 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gddws\" (UniqueName: \"kubernetes.io/projected/adc54c5e-4c8c-4c66-acc3-d8e38c5c074f-kube-api-access-gddws\") pod \"klusterlet-addon-workmgr-685dd846cd-c4xqt\" (UID: \"adc54c5e-4c8c-4c66-acc3-d8e38c5c074f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-685dd846cd-c4xqt" Apr 23 16:35:50.901748 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:50.901723 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/adc54c5e-4c8c-4c66-acc3-d8e38c5c074f-tmp\") pod \"klusterlet-addon-workmgr-685dd846cd-c4xqt\" (UID: \"adc54c5e-4c8c-4c66-acc3-d8e38c5c074f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-685dd846cd-c4xqt" Apr 23 16:35:50.904332 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:50.904308 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/adc54c5e-4c8c-4c66-acc3-d8e38c5c074f-klusterlet-config\") pod \"klusterlet-addon-workmgr-685dd846cd-c4xqt\" (UID: \"adc54c5e-4c8c-4c66-acc3-d8e38c5c074f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-685dd846cd-c4xqt" Apr 23 
16:35:50.915578 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:50.915558 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gddws\" (UniqueName: \"kubernetes.io/projected/adc54c5e-4c8c-4c66-acc3-d8e38c5c074f-kube-api-access-gddws\") pod \"klusterlet-addon-workmgr-685dd846cd-c4xqt\" (UID: \"adc54c5e-4c8c-4c66-acc3-d8e38c5c074f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-685dd846cd-c4xqt"
Apr 23 16:35:50.939593 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:50.939573 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-685dd846cd-c4xqt"
Apr 23 16:35:51.103038 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:51.102866 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8a8577c-7241-452c-bed6-2d35076dce94-cert\") pod \"ingress-canary-zbx8c\" (UID: \"e8a8577c-7241-452c-bed6-2d35076dce94\") " pod="openshift-ingress-canary/ingress-canary-zbx8c"
Apr 23 16:35:51.103212 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:51.103069 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-tls\") pod \"image-registry-dbbdff94c-xrmkj\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj"
Apr 23 16:35:51.103212 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:51.103011 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 16:35:51.103212 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:51.103146 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8a8577c-7241-452c-bed6-2d35076dce94-cert podName:e8a8577c-7241-452c-bed6-2d35076dce94 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:55.103131115 +0000 UTC m=+41.164205346 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e8a8577c-7241-452c-bed6-2d35076dce94-cert") pod "ingress-canary-zbx8c" (UID: "e8a8577c-7241-452c-bed6-2d35076dce94") : secret "canary-serving-cert" not found
Apr 23 16:35:51.103389 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:51.103220 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 16:35:51.103389 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:51.103235 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-dbbdff94c-xrmkj: secret "image-registry-tls" not found
Apr 23 16:35:51.103389 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:51.103312 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-tls podName:0a538e20-bab6-46fe-9017-54d1d693ba8c nodeName:}" failed. No retries permitted until 2026-04-23 16:35:55.103274825 +0000 UTC m=+41.164349063 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-tls") pod "image-registry-dbbdff94c-xrmkj" (UID: "0a538e20-bab6-46fe-9017-54d1d693ba8c") : secret "image-registry-tls" not found
Apr 23 16:35:51.108439 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:51.108413 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-685dd846cd-c4xqt"]
Apr 23 16:35:51.112906 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:35:51.112878 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadc54c5e_4c8c_4c66_acc3_d8e38c5c074f.slice/crio-b1e7dd8e6dd4cd71287dfbd49751730852df348e8d975d8b8d3575b05c6f6f6d WatchSource:0}: Error finding container b1e7dd8e6dd4cd71287dfbd49751730852df348e8d975d8b8d3575b05c6f6f6d: Status 404 returned error can't find the container with id b1e7dd8e6dd4cd71287dfbd49751730852df348e8d975d8b8d3575b05c6f6f6d
Apr 23 16:35:51.203766 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:51.203730 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-metrics-tls\") pod \"dns-default-4vzw2\" (UID: \"0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4\") " pod="openshift-dns/dns-default-4vzw2"
Apr 23 16:35:51.203905 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:51.203869 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 16:35:51.203959 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:51.203929 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-metrics-tls podName:0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:55.203915248 +0000 UTC m=+41.264989485 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-metrics-tls") pod "dns-default-4vzw2" (UID: "0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4") : secret "dns-default-metrics-tls" not found
Apr 23 16:35:51.767889 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:51.767849 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-685dd846cd-c4xqt" event={"ID":"adc54c5e-4c8c-4c66-acc3-d8e38c5c074f","Type":"ContainerStarted","Data":"b1e7dd8e6dd4cd71287dfbd49751730852df348e8d975d8b8d3575b05c6f6f6d"}
Apr 23 16:35:55.136594 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:55.136503 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8a8577c-7241-452c-bed6-2d35076dce94-cert\") pod \"ingress-canary-zbx8c\" (UID: \"e8a8577c-7241-452c-bed6-2d35076dce94\") " pod="openshift-ingress-canary/ingress-canary-zbx8c"
Apr 23 16:35:55.136594 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:55.136559 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-tls\") pod \"image-registry-dbbdff94c-xrmkj\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj"
Apr 23 16:35:55.137158 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:55.136654 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 16:35:55.137158 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:55.136718 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8a8577c-7241-452c-bed6-2d35076dce94-cert podName:e8a8577c-7241-452c-bed6-2d35076dce94 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:03.136700277 +0000 UTC m=+49.197774509 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e8a8577c-7241-452c-bed6-2d35076dce94-cert") pod "ingress-canary-zbx8c" (UID: "e8a8577c-7241-452c-bed6-2d35076dce94") : secret "canary-serving-cert" not found
Apr 23 16:35:55.137158 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:55.136662 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 16:35:55.137158 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:55.136761 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-dbbdff94c-xrmkj: secret "image-registry-tls" not found
Apr 23 16:35:55.137158 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:55.136807 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-tls podName:0a538e20-bab6-46fe-9017-54d1d693ba8c nodeName:}" failed. No retries permitted until 2026-04-23 16:36:03.136794831 +0000 UTC m=+49.197869062 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-tls") pod "image-registry-dbbdff94c-xrmkj" (UID: "0a538e20-bab6-46fe-9017-54d1d693ba8c") : secret "image-registry-tls" not found
Apr 23 16:35:55.237142 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:55.237105 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-metrics-tls\") pod \"dns-default-4vzw2\" (UID: \"0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4\") " pod="openshift-dns/dns-default-4vzw2"
Apr 23 16:35:55.237323 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:55.237278 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 16:35:55.237386 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:35:55.237372 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-metrics-tls podName:0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:03.23734998 +0000 UTC m=+49.298424212 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-metrics-tls") pod "dns-default-4vzw2" (UID: "0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4") : secret "dns-default-metrics-tls" not found
Apr 23 16:35:55.777685 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:55.777648 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-685dd846cd-c4xqt" event={"ID":"adc54c5e-4c8c-4c66-acc3-d8e38c5c074f","Type":"ContainerStarted","Data":"548a9a8c338c1f255ba673bb1aee09f566c52c3e915de4b0a1fda7c2550f236c"}
Apr 23 16:35:55.777880 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:55.777861 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-685dd846cd-c4xqt"
Apr 23 16:35:55.779376 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:55.779358 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-685dd846cd-c4xqt"
Apr 23 16:35:55.796597 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:35:55.796557 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-685dd846cd-c4xqt" podStartSLOduration=2.126917938 podStartE2EDuration="5.796544992s" podCreationTimestamp="2026-04-23 16:35:50 +0000 UTC" firstStartedPulling="2026-04-23 16:35:51.114600318 +0000 UTC m=+37.175674550" lastFinishedPulling="2026-04-23 16:35:54.784227358 +0000 UTC m=+40.845301604" observedRunningTime="2026-04-23 16:35:55.795855622 +0000 UTC m=+41.856929876" watchObservedRunningTime="2026-04-23 16:35:55.796544992 +0000 UTC m=+41.857619245"
Apr 23 16:36:03.193844 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:36:03.193769 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8a8577c-7241-452c-bed6-2d35076dce94-cert\") pod \"ingress-canary-zbx8c\" (UID: \"e8a8577c-7241-452c-bed6-2d35076dce94\") " pod="openshift-ingress-canary/ingress-canary-zbx8c"
Apr 23 16:36:03.194337 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:36:03.193862 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-tls\") pod \"image-registry-dbbdff94c-xrmkj\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj"
Apr 23 16:36:03.194337 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:36:03.193934 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 16:36:03.194337 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:36:03.193965 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 16:36:03.194337 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:36:03.193976 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-dbbdff94c-xrmkj: secret "image-registry-tls" not found
Apr 23 16:36:03.194337 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:36:03.194025 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8a8577c-7241-452c-bed6-2d35076dce94-cert podName:e8a8577c-7241-452c-bed6-2d35076dce94 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:19.194008557 +0000 UTC m=+65.255082788 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e8a8577c-7241-452c-bed6-2d35076dce94-cert") pod "ingress-canary-zbx8c" (UID: "e8a8577c-7241-452c-bed6-2d35076dce94") : secret "canary-serving-cert" not found
Apr 23 16:36:03.194337 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:36:03.194041 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-tls podName:0a538e20-bab6-46fe-9017-54d1d693ba8c nodeName:}" failed. No retries permitted until 2026-04-23 16:36:19.194034647 +0000 UTC m=+65.255108879 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-tls") pod "image-registry-dbbdff94c-xrmkj" (UID: "0a538e20-bab6-46fe-9017-54d1d693ba8c") : secret "image-registry-tls" not found
Apr 23 16:36:03.295098 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:36:03.295066 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-metrics-tls\") pod \"dns-default-4vzw2\" (UID: \"0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4\") " pod="openshift-dns/dns-default-4vzw2"
Apr 23 16:36:03.295244 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:36:03.295221 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 16:36:03.295331 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:36:03.295320 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-metrics-tls podName:0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:19.29527613 +0000 UTC m=+65.356350361 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-metrics-tls") pod "dns-default-4vzw2" (UID: "0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4") : secret "dns-default-metrics-tls" not found
Apr 23 16:36:12.750014 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:36:12.749985 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nfkqz"
Apr 23 16:36:19.212852 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:36:19.212818 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8a8577c-7241-452c-bed6-2d35076dce94-cert\") pod \"ingress-canary-zbx8c\" (UID: \"e8a8577c-7241-452c-bed6-2d35076dce94\") " pod="openshift-ingress-canary/ingress-canary-zbx8c"
Apr 23 16:36:19.213319 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:36:19.212868 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-tls\") pod \"image-registry-dbbdff94c-xrmkj\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj"
Apr 23 16:36:19.213319 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:36:19.212963 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 16:36:19.213319 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:36:19.213033 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8a8577c-7241-452c-bed6-2d35076dce94-cert podName:e8a8577c-7241-452c-bed6-2d35076dce94 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:51.213018057 +0000 UTC m=+97.274092293 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e8a8577c-7241-452c-bed6-2d35076dce94-cert") pod "ingress-canary-zbx8c" (UID: "e8a8577c-7241-452c-bed6-2d35076dce94") : secret "canary-serving-cert" not found
Apr 23 16:36:19.213319 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:36:19.212970 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 16:36:19.213319 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:36:19.213064 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-dbbdff94c-xrmkj: secret "image-registry-tls" not found
Apr 23 16:36:19.213319 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:36:19.213119 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-tls podName:0a538e20-bab6-46fe-9017-54d1d693ba8c nodeName:}" failed. No retries permitted until 2026-04-23 16:36:51.213107921 +0000 UTC m=+97.274182153 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-tls") pod "image-registry-dbbdff94c-xrmkj" (UID: "0a538e20-bab6-46fe-9017-54d1d693ba8c") : secret "image-registry-tls" not found
Apr 23 16:36:19.313754 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:36:19.313708 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-metrics-tls\") pod \"dns-default-4vzw2\" (UID: \"0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4\") " pod="openshift-dns/dns-default-4vzw2"
Apr 23 16:36:19.313939 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:36:19.313822 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 16:36:19.313939 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:36:19.313888 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-metrics-tls podName:0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:51.313875227 +0000 UTC m=+97.374949459 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-metrics-tls") pod "dns-default-4vzw2" (UID: "0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4") : secret "dns-default-metrics-tls" not found
Apr 23 16:36:20.220675 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:36:20.220636 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-metrics-certs\") pod \"network-metrics-daemon-5ps7g\" (UID: \"4033b659-eaae-4ad3-a8a3-523bdf5fcf89\") " pod="openshift-multus/network-metrics-daemon-5ps7g"
Apr 23 16:36:20.224225 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:36:20.224205 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 16:36:20.230855 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:36:20.230837 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 16:36:20.230907 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:36:20.230886 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-metrics-certs podName:4033b659-eaae-4ad3-a8a3-523bdf5fcf89 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:24.230872574 +0000 UTC m=+130.291946807 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-metrics-certs") pod "network-metrics-daemon-5ps7g" (UID: "4033b659-eaae-4ad3-a8a3-523bdf5fcf89") : secret "metrics-daemon-secret" not found
Apr 23 16:36:20.321868 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:36:20.321824 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pjms6\" (UniqueName: \"kubernetes.io/projected/3c0ebdcb-a7e8-4e29-a486-52aed308cf33-kube-api-access-pjms6\") pod \"network-check-target-kh9hh\" (UID: \"3c0ebdcb-a7e8-4e29-a486-52aed308cf33\") " pod="openshift-network-diagnostics/network-check-target-kh9hh"
Apr 23 16:36:20.321991 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:36:20.321888 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/047666b9-5e8b-4117-8317-ca917bf89757-original-pull-secret\") pod \"global-pull-secret-syncer-tr482\" (UID: \"047666b9-5e8b-4117-8317-ca917bf89757\") " pod="kube-system/global-pull-secret-syncer-tr482"
Apr 23 16:36:20.324902 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:36:20.324869 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 16:36:20.325002 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:36:20.324913 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 23 16:36:20.334970 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:36:20.334938 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 16:36:20.335129 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:36:20.335111 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/047666b9-5e8b-4117-8317-ca917bf89757-original-pull-secret\") pod \"global-pull-secret-syncer-tr482\" (UID: \"047666b9-5e8b-4117-8317-ca917bf89757\") " pod="kube-system/global-pull-secret-syncer-tr482"
Apr 23 16:36:20.345432 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:36:20.345409 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjms6\" (UniqueName: \"kubernetes.io/projected/3c0ebdcb-a7e8-4e29-a486-52aed308cf33-kube-api-access-pjms6\") pod \"network-check-target-kh9hh\" (UID: \"3c0ebdcb-a7e8-4e29-a486-52aed308cf33\") " pod="openshift-network-diagnostics/network-check-target-kh9hh"
Apr 23 16:36:20.348220 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:36:20.348199 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tr482"
Apr 23 16:36:20.465094 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:36:20.465064 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-tr482"]
Apr 23 16:36:20.468919 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:36:20.468888 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod047666b9_5e8b_4117_8317_ca917bf89757.slice/crio-255194ba00c589ea1ac4190d1efcd0ae04b4dbf0638d0de40f0e65774ad2341c WatchSource:0}: Error finding container 255194ba00c589ea1ac4190d1efcd0ae04b4dbf0638d0de40f0e65774ad2341c: Status 404 returned error can't find the container with id 255194ba00c589ea1ac4190d1efcd0ae04b4dbf0638d0de40f0e65774ad2341c
Apr 23 16:36:20.644730 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:36:20.644696 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tpj7t\""
Apr 23 16:36:20.652814 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:36:20.652794 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh9hh"
Apr 23 16:36:20.768463 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:36:20.768435 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c0ebdcb_a7e8_4e29_a486_52aed308cf33.slice/crio-93e5d11f0d54457b2fca7f2edc4f5bcef6527afb9926795d614386c4cdb4da9f WatchSource:0}: Error finding container 93e5d11f0d54457b2fca7f2edc4f5bcef6527afb9926795d614386c4cdb4da9f: Status 404 returned error can't find the container with id 93e5d11f0d54457b2fca7f2edc4f5bcef6527afb9926795d614386c4cdb4da9f
Apr 23 16:36:20.779666 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:36:20.779644 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-kh9hh"]
Apr 23 16:36:20.822847 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:36:20.822815 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-tr482" event={"ID":"047666b9-5e8b-4117-8317-ca917bf89757","Type":"ContainerStarted","Data":"255194ba00c589ea1ac4190d1efcd0ae04b4dbf0638d0de40f0e65774ad2341c"}
Apr 23 16:36:20.823735 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:36:20.823714 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-kh9hh" event={"ID":"3c0ebdcb-a7e8-4e29-a486-52aed308cf33","Type":"ContainerStarted","Data":"93e5d11f0d54457b2fca7f2edc4f5bcef6527afb9926795d614386c4cdb4da9f"}
Apr 23 16:36:25.835682 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:36:25.835635 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-tr482" event={"ID":"047666b9-5e8b-4117-8317-ca917bf89757","Type":"ContainerStarted","Data":"b63472376c004e02d8941cfd1ed7f3417fe3e5a3ba29dd9ec7bf94f9f6799cc6"}
Apr 23 16:36:25.836874 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:36:25.836847 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-kh9hh" event={"ID":"3c0ebdcb-a7e8-4e29-a486-52aed308cf33","Type":"ContainerStarted","Data":"8383a3442b1bc0e71db92d4e03ed737bdd5a08e376d2cbc86305ff3b497cd0ab"}
Apr 23 16:36:25.837008 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:36:25.836992 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-kh9hh"
Apr 23 16:36:25.854009 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:36:25.853964 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-tr482" podStartSLOduration=65.987596278 podStartE2EDuration="1m10.853952723s" podCreationTimestamp="2026-04-23 16:35:15 +0000 UTC" firstStartedPulling="2026-04-23 16:36:20.470971388 +0000 UTC m=+66.532045620" lastFinishedPulling="2026-04-23 16:36:25.337327833 +0000 UTC m=+71.398402065" observedRunningTime="2026-04-23 16:36:25.853160197 +0000 UTC m=+71.914234451" watchObservedRunningTime="2026-04-23 16:36:25.853952723 +0000 UTC m=+71.915027009"
Apr 23 16:36:25.872158 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:36:25.872120 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-kh9hh" podStartSLOduration=67.316758045 podStartE2EDuration="1m11.872108422s" podCreationTimestamp="2026-04-23 16:35:14 +0000 UTC" firstStartedPulling="2026-04-23 16:36:20.770433138 +0000 UTC m=+66.831507370" lastFinishedPulling="2026-04-23 16:36:25.325783506 +0000 UTC m=+71.386857747" observedRunningTime="2026-04-23 16:36:25.872037471 +0000 UTC m=+71.933111725" watchObservedRunningTime="2026-04-23 16:36:25.872108422 +0000 UTC m=+71.933182675"
Apr 23 16:36:51.248106 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:36:51.248051 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8a8577c-7241-452c-bed6-2d35076dce94-cert\") pod \"ingress-canary-zbx8c\" (UID: \"e8a8577c-7241-452c-bed6-2d35076dce94\") " pod="openshift-ingress-canary/ingress-canary-zbx8c"
Apr 23 16:36:51.248106 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:36:51.248121 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-tls\") pod \"image-registry-dbbdff94c-xrmkj\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj"
Apr 23 16:36:51.248549 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:36:51.248214 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 16:36:51.248549 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:36:51.248277 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 16:36:51.248549 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:36:51.248315 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-dbbdff94c-xrmkj: secret "image-registry-tls" not found
Apr 23 16:36:51.248549 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:36:51.248284 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8a8577c-7241-452c-bed6-2d35076dce94-cert podName:e8a8577c-7241-452c-bed6-2d35076dce94 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:55.24826814 +0000 UTC m=+161.309342375 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e8a8577c-7241-452c-bed6-2d35076dce94-cert") pod "ingress-canary-zbx8c" (UID: "e8a8577c-7241-452c-bed6-2d35076dce94") : secret "canary-serving-cert" not found
Apr 23 16:36:51.248549 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:36:51.248386 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-tls podName:0a538e20-bab6-46fe-9017-54d1d693ba8c nodeName:}" failed. No retries permitted until 2026-04-23 16:37:55.248372802 +0000 UTC m=+161.309447034 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-tls") pod "image-registry-dbbdff94c-xrmkj" (UID: "0a538e20-bab6-46fe-9017-54d1d693ba8c") : secret "image-registry-tls" not found
Apr 23 16:36:51.348984 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:36:51.348950 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-metrics-tls\") pod \"dns-default-4vzw2\" (UID: \"0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4\") " pod="openshift-dns/dns-default-4vzw2"
Apr 23 16:36:51.349147 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:36:51.349094 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 16:36:51.349191 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:36:51.349165 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-metrics-tls podName:0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:55.349150507 +0000 UTC m=+161.410224739 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-metrics-tls") pod "dns-default-4vzw2" (UID: "0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4") : secret "dns-default-metrics-tls" not found
Apr 23 16:36:56.841627 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:36:56.841598 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-kh9hh"
Apr 23 16:37:24.295315 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:24.295235 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-metrics-certs\") pod \"network-metrics-daemon-5ps7g\" (UID: \"4033b659-eaae-4ad3-a8a3-523bdf5fcf89\") " pod="openshift-multus/network-metrics-daemon-5ps7g"
Apr 23 16:37:24.295726 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:37:24.295384 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 16:37:24.295726 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:37:24.295460 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-metrics-certs podName:4033b659-eaae-4ad3-a8a3-523bdf5fcf89 nodeName:}" failed. No retries permitted until 2026-04-23 16:39:26.295445904 +0000 UTC m=+252.356520135 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-metrics-certs") pod "network-metrics-daemon-5ps7g" (UID: "4033b659-eaae-4ad3-a8a3-523bdf5fcf89") : secret "metrics-daemon-secret" not found
Apr 23 16:37:30.990306 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:30.990268 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w5pdt_5da0035a-7e6e-4e50-9404-1dde996e4313/dns-node-resolver/0.log"
Apr 23 16:37:31.589720 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:31.589694 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-z94bz_8e6b0d15-ef44-4cfa-b68b-1d1110a5e3b4/node-ca/0.log"
Apr 23 16:37:34.655200 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.655167 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-77tn6"]
Apr 23 16:37:34.658026 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.658010 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-77tn6"
Apr 23 16:37:34.668815 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.668791 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt5wf\" (UniqueName: \"kubernetes.io/projected/046b94e7-f586-4a7f-b68c-ca66c74a0ab6-kube-api-access-jt5wf\") pod \"kube-storage-version-migrator-operator-6769c5d45-77tn6\" (UID: \"046b94e7-f586-4a7f-b68c-ca66c74a0ab6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-77tn6"
Apr 23 16:37:34.668926 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.668834 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/046b94e7-f586-4a7f-b68c-ca66c74a0ab6-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-77tn6\" (UID: \"046b94e7-f586-4a7f-b68c-ca66c74a0ab6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-77tn6"
Apr 23 16:37:34.668965 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.668930 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/046b94e7-f586-4a7f-b68c-ca66c74a0ab6-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-77tn6\" (UID: \"046b94e7-f586-4a7f-b68c-ca66c74a0ab6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-77tn6"
Apr 23 16:37:34.675770 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.675743 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 23 16:37:34.675884 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.675865 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 23 16:37:34.676683 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.676663 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rq7kn"]
Apr 23 16:37:34.677104 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.677077 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 23 16:37:34.677104 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.677098 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 23 16:37:34.677242 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.677078 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-ppmwn\""
Apr 23 16:37:34.679408 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.679393 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rq7kn"
Apr 23 16:37:34.684124 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.684102 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 23 16:37:34.684212 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.684115 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 23 16:37:34.684412 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.684396 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 23 16:37:34.685107 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.685063 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-8gqkj\""
Apr 23 16:37:34.685342 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.685326 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 23 16:37:34.699977 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.699954 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-77tn6"]
Apr 23 16:37:34.704937 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.704914 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rq7kn"]
Apr 23 16:37:34.769752 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.769719 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/046b94e7-f586-4a7f-b68c-ca66c74a0ab6-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-77tn6\"
(UID: \"046b94e7-f586-4a7f-b68c-ca66c74a0ab6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-77tn6" Apr 23 16:37:34.769908 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.769767 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jt5wf\" (UniqueName: \"kubernetes.io/projected/046b94e7-f586-4a7f-b68c-ca66c74a0ab6-kube-api-access-jt5wf\") pod \"kube-storage-version-migrator-operator-6769c5d45-77tn6\" (UID: \"046b94e7-f586-4a7f-b68c-ca66c74a0ab6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-77tn6" Apr 23 16:37:34.769954 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.769905 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/046b94e7-f586-4a7f-b68c-ca66c74a0ab6-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-77tn6\" (UID: \"046b94e7-f586-4a7f-b68c-ca66c74a0ab6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-77tn6" Apr 23 16:37:34.769954 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.769948 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45986c9e-a457-4f12-b928-2ef7295dbf7a-serving-cert\") pod \"service-ca-operator-d6fc45fc5-rq7kn\" (UID: \"45986c9e-a457-4f12-b928-2ef7295dbf7a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rq7kn" Apr 23 16:37:34.770040 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.770002 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45986c9e-a457-4f12-b928-2ef7295dbf7a-config\") pod \"service-ca-operator-d6fc45fc5-rq7kn\" (UID: \"45986c9e-a457-4f12-b928-2ef7295dbf7a\") " 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rq7kn" Apr 23 16:37:34.770040 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.770029 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zg4c\" (UniqueName: \"kubernetes.io/projected/45986c9e-a457-4f12-b928-2ef7295dbf7a-kube-api-access-5zg4c\") pod \"service-ca-operator-d6fc45fc5-rq7kn\" (UID: \"45986c9e-a457-4f12-b928-2ef7295dbf7a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rq7kn" Apr 23 16:37:34.770344 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.770321 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/046b94e7-f586-4a7f-b68c-ca66c74a0ab6-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-77tn6\" (UID: \"046b94e7-f586-4a7f-b68c-ca66c74a0ab6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-77tn6" Apr 23 16:37:34.772019 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.772002 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/046b94e7-f586-4a7f-b68c-ca66c74a0ab6-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-77tn6\" (UID: \"046b94e7-f586-4a7f-b68c-ca66c74a0ab6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-77tn6" Apr 23 16:37:34.779136 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.779112 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt5wf\" (UniqueName: \"kubernetes.io/projected/046b94e7-f586-4a7f-b68c-ca66c74a0ab6-kube-api-access-jt5wf\") pod \"kube-storage-version-migrator-operator-6769c5d45-77tn6\" (UID: \"046b94e7-f586-4a7f-b68c-ca66c74a0ab6\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-77tn6" Apr 23 16:37:34.871352 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.871316 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45986c9e-a457-4f12-b928-2ef7295dbf7a-serving-cert\") pod \"service-ca-operator-d6fc45fc5-rq7kn\" (UID: \"45986c9e-a457-4f12-b928-2ef7295dbf7a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rq7kn" Apr 23 16:37:34.871502 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.871362 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45986c9e-a457-4f12-b928-2ef7295dbf7a-config\") pod \"service-ca-operator-d6fc45fc5-rq7kn\" (UID: \"45986c9e-a457-4f12-b928-2ef7295dbf7a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rq7kn" Apr 23 16:37:34.871502 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.871379 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5zg4c\" (UniqueName: \"kubernetes.io/projected/45986c9e-a457-4f12-b928-2ef7295dbf7a-kube-api-access-5zg4c\") pod \"service-ca-operator-d6fc45fc5-rq7kn\" (UID: \"45986c9e-a457-4f12-b928-2ef7295dbf7a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rq7kn" Apr 23 16:37:34.871955 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.871925 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45986c9e-a457-4f12-b928-2ef7295dbf7a-config\") pod \"service-ca-operator-d6fc45fc5-rq7kn\" (UID: \"45986c9e-a457-4f12-b928-2ef7295dbf7a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rq7kn" Apr 23 16:37:34.873620 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.873598 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/45986c9e-a457-4f12-b928-2ef7295dbf7a-serving-cert\") pod \"service-ca-operator-d6fc45fc5-rq7kn\" (UID: \"45986c9e-a457-4f12-b928-2ef7295dbf7a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rq7kn" Apr 23 16:37:34.880924 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.880899 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zg4c\" (UniqueName: \"kubernetes.io/projected/45986c9e-a457-4f12-b928-2ef7295dbf7a-kube-api-access-5zg4c\") pod \"service-ca-operator-d6fc45fc5-rq7kn\" (UID: \"45986c9e-a457-4f12-b928-2ef7295dbf7a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rq7kn" Apr 23 16:37:34.965968 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.965936 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-77tn6" Apr 23 16:37:34.987850 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:34.987818 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rq7kn" Apr 23 16:37:35.089850 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:35.089817 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-77tn6"] Apr 23 16:37:35.092884 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:37:35.092857 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod046b94e7_f586_4a7f_b68c_ca66c74a0ab6.slice/crio-5e66c39a8c0edead762f3481fcc2aeb2acaccc5ad3769538ebb1deb41e8878bc WatchSource:0}: Error finding container 5e66c39a8c0edead762f3481fcc2aeb2acaccc5ad3769538ebb1deb41e8878bc: Status 404 returned error can't find the container with id 5e66c39a8c0edead762f3481fcc2aeb2acaccc5ad3769538ebb1deb41e8878bc Apr 23 16:37:35.109636 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:35.109612 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rq7kn"] Apr 23 16:37:35.112556 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:37:35.112534 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45986c9e_a457_4f12_b928_2ef7295dbf7a.slice/crio-967e0ba169af76edc77f2dd5759373b4f0f925e3753099ae0b1374953df87153 WatchSource:0}: Error finding container 967e0ba169af76edc77f2dd5759373b4f0f925e3753099ae0b1374953df87153: Status 404 returned error can't find the container with id 967e0ba169af76edc77f2dd5759373b4f0f925e3753099ae0b1374953df87153 Apr 23 16:37:35.973885 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:35.973833 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rq7kn" 
event={"ID":"45986c9e-a457-4f12-b928-2ef7295dbf7a","Type":"ContainerStarted","Data":"967e0ba169af76edc77f2dd5759373b4f0f925e3753099ae0b1374953df87153"} Apr 23 16:37:35.975136 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:35.975106 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-77tn6" event={"ID":"046b94e7-f586-4a7f-b68c-ca66c74a0ab6","Type":"ContainerStarted","Data":"5e66c39a8c0edead762f3481fcc2aeb2acaccc5ad3769538ebb1deb41e8878bc"} Apr 23 16:37:37.980690 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:37.980648 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-77tn6" event={"ID":"046b94e7-f586-4a7f-b68c-ca66c74a0ab6","Type":"ContainerStarted","Data":"669fe8612425d2ef1aa841bf298afe24adfb7f9ec7b22e39daad62a0f028b628"} Apr 23 16:37:37.981964 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:37.981938 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rq7kn" event={"ID":"45986c9e-a457-4f12-b928-2ef7295dbf7a","Type":"ContainerStarted","Data":"8953babe3d006d26aba5555b2cfbe4d21d1595e5b4cd2bd1b35e92bb4edb55a8"} Apr 23 16:37:37.998955 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:37.998916 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-77tn6" podStartSLOduration=1.727108079 podStartE2EDuration="3.99890507s" podCreationTimestamp="2026-04-23 16:37:34 +0000 UTC" firstStartedPulling="2026-04-23 16:37:35.094558563 +0000 UTC m=+141.155632795" lastFinishedPulling="2026-04-23 16:37:37.366355551 +0000 UTC m=+143.427429786" observedRunningTime="2026-04-23 16:37:37.998468798 +0000 UTC m=+144.059543055" watchObservedRunningTime="2026-04-23 16:37:37.99890507 +0000 UTC 
m=+144.059979324" Apr 23 16:37:38.016013 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:38.015965 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rq7kn" podStartSLOduration=1.765870705 podStartE2EDuration="4.015953159s" podCreationTimestamp="2026-04-23 16:37:34 +0000 UTC" firstStartedPulling="2026-04-23 16:37:35.114268583 +0000 UTC m=+141.175342819" lastFinishedPulling="2026-04-23 16:37:37.364351036 +0000 UTC m=+143.425425273" observedRunningTime="2026-04-23 16:37:38.015017374 +0000 UTC m=+144.076091629" watchObservedRunningTime="2026-04-23 16:37:38.015953159 +0000 UTC m=+144.077027413" Apr 23 16:37:40.427535 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:40.427501 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-7hzhd"] Apr 23 16:37:40.430855 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:40.430834 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-7hzhd" Apr 23 16:37:40.434366 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:40.434343 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 23 16:37:40.434452 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:40.434343 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 23 16:37:40.434962 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:40.434948 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 23 16:37:40.435692 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:40.435675 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 23 16:37:40.440784 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:40.436092 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-9kqth\"" Apr 23 16:37:40.441221 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:40.440793 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-7hzhd"] Apr 23 16:37:40.513990 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:40.513952 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b5960f83-7182-4b8c-ae70-fd1b694f2b5e-signing-cabundle\") pod \"service-ca-865cb79987-7hzhd\" (UID: \"b5960f83-7182-4b8c-ae70-fd1b694f2b5e\") " pod="openshift-service-ca/service-ca-865cb79987-7hzhd" Apr 23 16:37:40.514142 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:40.514050 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5s6f\" (UniqueName: 
\"kubernetes.io/projected/b5960f83-7182-4b8c-ae70-fd1b694f2b5e-kube-api-access-w5s6f\") pod \"service-ca-865cb79987-7hzhd\" (UID: \"b5960f83-7182-4b8c-ae70-fd1b694f2b5e\") " pod="openshift-service-ca/service-ca-865cb79987-7hzhd" Apr 23 16:37:40.514142 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:40.514095 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b5960f83-7182-4b8c-ae70-fd1b694f2b5e-signing-key\") pod \"service-ca-865cb79987-7hzhd\" (UID: \"b5960f83-7182-4b8c-ae70-fd1b694f2b5e\") " pod="openshift-service-ca/service-ca-865cb79987-7hzhd" Apr 23 16:37:40.614490 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:40.614449 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w5s6f\" (UniqueName: \"kubernetes.io/projected/b5960f83-7182-4b8c-ae70-fd1b694f2b5e-kube-api-access-w5s6f\") pod \"service-ca-865cb79987-7hzhd\" (UID: \"b5960f83-7182-4b8c-ae70-fd1b694f2b5e\") " pod="openshift-service-ca/service-ca-865cb79987-7hzhd" Apr 23 16:37:40.614490 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:40.614488 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b5960f83-7182-4b8c-ae70-fd1b694f2b5e-signing-key\") pod \"service-ca-865cb79987-7hzhd\" (UID: \"b5960f83-7182-4b8c-ae70-fd1b694f2b5e\") " pod="openshift-service-ca/service-ca-865cb79987-7hzhd" Apr 23 16:37:40.614702 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:40.614548 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b5960f83-7182-4b8c-ae70-fd1b694f2b5e-signing-cabundle\") pod \"service-ca-865cb79987-7hzhd\" (UID: \"b5960f83-7182-4b8c-ae70-fd1b694f2b5e\") " pod="openshift-service-ca/service-ca-865cb79987-7hzhd" Apr 23 16:37:40.615161 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:40.615141 
2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b5960f83-7182-4b8c-ae70-fd1b694f2b5e-signing-cabundle\") pod \"service-ca-865cb79987-7hzhd\" (UID: \"b5960f83-7182-4b8c-ae70-fd1b694f2b5e\") " pod="openshift-service-ca/service-ca-865cb79987-7hzhd" Apr 23 16:37:40.617011 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:40.616985 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b5960f83-7182-4b8c-ae70-fd1b694f2b5e-signing-key\") pod \"service-ca-865cb79987-7hzhd\" (UID: \"b5960f83-7182-4b8c-ae70-fd1b694f2b5e\") " pod="openshift-service-ca/service-ca-865cb79987-7hzhd" Apr 23 16:37:40.623997 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:40.623976 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5s6f\" (UniqueName: \"kubernetes.io/projected/b5960f83-7182-4b8c-ae70-fd1b694f2b5e-kube-api-access-w5s6f\") pod \"service-ca-865cb79987-7hzhd\" (UID: \"b5960f83-7182-4b8c-ae70-fd1b694f2b5e\") " pod="openshift-service-ca/service-ca-865cb79987-7hzhd" Apr 23 16:37:40.743364 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:40.743333 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-7hzhd" Apr 23 16:37:40.869872 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:40.869843 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-7hzhd"] Apr 23 16:37:40.873383 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:37:40.873358 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5960f83_7182_4b8c_ae70_fd1b694f2b5e.slice/crio-3b3be80f61fe9d2d6d7bcb9ec7601f74f7e10d66995e15e1fdfa12082646de7d WatchSource:0}: Error finding container 3b3be80f61fe9d2d6d7bcb9ec7601f74f7e10d66995e15e1fdfa12082646de7d: Status 404 returned error can't find the container with id 3b3be80f61fe9d2d6d7bcb9ec7601f74f7e10d66995e15e1fdfa12082646de7d Apr 23 16:37:40.990117 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:40.990075 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-7hzhd" event={"ID":"b5960f83-7182-4b8c-ae70-fd1b694f2b5e","Type":"ContainerStarted","Data":"f08e93961342919dd9a40cea63d5aeebd6718cdfa0d588c645da267e6bd01409"} Apr 23 16:37:40.990117 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:40.990118 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-7hzhd" event={"ID":"b5960f83-7182-4b8c-ae70-fd1b694f2b5e","Type":"ContainerStarted","Data":"3b3be80f61fe9d2d6d7bcb9ec7601f74f7e10d66995e15e1fdfa12082646de7d"} Apr 23 16:37:41.027695 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:41.027596 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-7hzhd" podStartSLOduration=1.027582241 podStartE2EDuration="1.027582241s" podCreationTimestamp="2026-04-23 16:37:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 
16:37:41.027326598 +0000 UTC m=+147.088400849" watchObservedRunningTime="2026-04-23 16:37:41.027582241 +0000 UTC m=+147.088656495" Apr 23 16:37:50.364372 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:37:50.364329 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" podUID="0a538e20-bab6-46fe-9017-54d1d693ba8c" Apr 23 16:37:50.386623 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:37:50.386582 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-zbx8c" podUID="e8a8577c-7241-452c-bed6-2d35076dce94" Apr 23 16:37:50.476836 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:37:50.476787 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-4vzw2" podUID="0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4" Apr 23 16:37:51.016613 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:51.016582 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zbx8c" Apr 23 16:37:51.016766 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:51.016584 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4vzw2" Apr 23 16:37:51.016766 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:51.016582 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:37:51.553226 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:37:51.553183 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-5ps7g" podUID="4033b659-eaae-4ad3-a8a3-523bdf5fcf89" Apr 23 16:37:55.026805 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:55.026770 2580 generic.go:358] "Generic (PLEG): container finished" podID="adc54c5e-4c8c-4c66-acc3-d8e38c5c074f" containerID="548a9a8c338c1f255ba673bb1aee09f566c52c3e915de4b0a1fda7c2550f236c" exitCode=1 Apr 23 16:37:55.027164 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:55.026812 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-685dd846cd-c4xqt" event={"ID":"adc54c5e-4c8c-4c66-acc3-d8e38c5c074f","Type":"ContainerDied","Data":"548a9a8c338c1f255ba673bb1aee09f566c52c3e915de4b0a1fda7c2550f236c"} Apr 23 16:37:55.027206 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:55.027174 2580 scope.go:117] "RemoveContainer" containerID="548a9a8c338c1f255ba673bb1aee09f566c52c3e915de4b0a1fda7c2550f236c" Apr 23 16:37:55.325967 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:55.325927 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8a8577c-7241-452c-bed6-2d35076dce94-cert\") pod \"ingress-canary-zbx8c\" (UID: \"e8a8577c-7241-452c-bed6-2d35076dce94\") " pod="openshift-ingress-canary/ingress-canary-zbx8c" Apr 23 16:37:55.325967 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:55.325975 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-tls\") pod \"image-registry-dbbdff94c-xrmkj\" (UID: 
\"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:37:55.328365 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:55.328341 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8a8577c-7241-452c-bed6-2d35076dce94-cert\") pod \"ingress-canary-zbx8c\" (UID: \"e8a8577c-7241-452c-bed6-2d35076dce94\") " pod="openshift-ingress-canary/ingress-canary-zbx8c" Apr 23 16:37:55.328493 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:55.328381 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-tls\") pod \"image-registry-dbbdff94c-xrmkj\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:37:55.426896 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:55.426842 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-metrics-tls\") pod \"dns-default-4vzw2\" (UID: \"0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4\") " pod="openshift-dns/dns-default-4vzw2" Apr 23 16:37:55.429086 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:55.429066 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4-metrics-tls\") pod \"dns-default-4vzw2\" (UID: \"0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4\") " pod="openshift-dns/dns-default-4vzw2" Apr 23 16:37:55.521906 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:55.521873 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vvvrx\"" Apr 23 16:37:55.521906 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:55.521873 2580 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-x7fs6\"" Apr 23 16:37:55.522125 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:55.521931 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l9v47\"" Apr 23 16:37:55.528565 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:55.528544 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:37:55.528681 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:55.528638 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4vzw2" Apr 23 16:37:55.528738 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:55.528710 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zbx8c" Apr 23 16:37:55.667400 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:55.667367 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4vzw2"] Apr 23 16:37:55.671062 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:37:55.671021 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cdc9ee5_b02e_4e33_a558_3cb94cae3fe4.slice/crio-83aa0e086d9571799cb15dcd84d1737a0ee8ef7fb3d6895c5e0a97fcaae7538c WatchSource:0}: Error finding container 83aa0e086d9571799cb15dcd84d1737a0ee8ef7fb3d6895c5e0a97fcaae7538c: Status 404 returned error can't find the container with id 83aa0e086d9571799cb15dcd84d1737a0ee8ef7fb3d6895c5e0a97fcaae7538c Apr 23 16:37:55.778061 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:55.778018 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-685dd846cd-c4xqt" Apr 23 16:37:55.893599 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:55.893532 2580 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zbx8c"] Apr 23 16:37:55.896456 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:37:55.896424 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8a8577c_7241_452c_bed6_2d35076dce94.slice/crio-e354b6c39760c1c34c880190f6da54945a2a2266d7b7c2ea35aa2db02edaeec7 WatchSource:0}: Error finding container e354b6c39760c1c34c880190f6da54945a2a2266d7b7c2ea35aa2db02edaeec7: Status 404 returned error can't find the container with id e354b6c39760c1c34c880190f6da54945a2a2266d7b7c2ea35aa2db02edaeec7 Apr 23 16:37:55.903237 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:55.903216 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-dbbdff94c-xrmkj"] Apr 23 16:37:55.906098 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:37:55.906076 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a538e20_bab6_46fe_9017_54d1d693ba8c.slice/crio-0612b87f973d1d2f8e4975935e1b53c2865f659dbb9e522377ca3da77cdd54ca WatchSource:0}: Error finding container 0612b87f973d1d2f8e4975935e1b53c2865f659dbb9e522377ca3da77cdd54ca: Status 404 returned error can't find the container with id 0612b87f973d1d2f8e4975935e1b53c2865f659dbb9e522377ca3da77cdd54ca Apr 23 16:37:56.031091 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:56.031054 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" event={"ID":"0a538e20-bab6-46fe-9017-54d1d693ba8c","Type":"ContainerStarted","Data":"8a6d8b85e1b6b80f855f2a38e5aec1fc6632ae314aa1ce5bbb73ea8efd5520b1"} Apr 23 16:37:56.031091 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:56.031094 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" 
event={"ID":"0a538e20-bab6-46fe-9017-54d1d693ba8c","Type":"ContainerStarted","Data":"0612b87f973d1d2f8e4975935e1b53c2865f659dbb9e522377ca3da77cdd54ca"} Apr 23 16:37:56.031645 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:56.031136 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:37:56.032377 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:56.032343 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zbx8c" event={"ID":"e8a8577c-7241-452c-bed6-2d35076dce94","Type":"ContainerStarted","Data":"e354b6c39760c1c34c880190f6da54945a2a2266d7b7c2ea35aa2db02edaeec7"} Apr 23 16:37:56.033243 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:56.033220 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4vzw2" event={"ID":"0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4","Type":"ContainerStarted","Data":"83aa0e086d9571799cb15dcd84d1737a0ee8ef7fb3d6895c5e0a97fcaae7538c"} Apr 23 16:37:56.034712 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:56.034690 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-685dd846cd-c4xqt" event={"ID":"adc54c5e-4c8c-4c66-acc3-d8e38c5c074f","Type":"ContainerStarted","Data":"417786364afcda53e5b7fdcbfb3b42f175aa460155a410e9644798c5e816b02b"} Apr 23 16:37:56.034897 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:56.034882 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-685dd846cd-c4xqt" Apr 23 16:37:56.035555 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:56.035538 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-685dd846cd-c4xqt" Apr 23 16:37:56.054226 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:56.054179 2580 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" podStartSLOduration=162.054168716 podStartE2EDuration="2m42.054168716s" podCreationTimestamp="2026-04-23 16:35:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:37:56.05334582 +0000 UTC m=+162.114420071" watchObservedRunningTime="2026-04-23 16:37:56.054168716 +0000 UTC m=+162.115242971" Apr 23 16:37:58.041558 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:58.041466 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zbx8c" event={"ID":"e8a8577c-7241-452c-bed6-2d35076dce94","Type":"ContainerStarted","Data":"2f07ec448ad0a309427b03b0f5d2c733d7cb82a360561e69208ed7da0243ef7f"} Apr 23 16:37:58.043106 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:58.043077 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4vzw2" event={"ID":"0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4","Type":"ContainerStarted","Data":"86b96b17df93b379da9b3ea683005df13b10acc95b189153f0ab1ffe91b9f49f"} Apr 23 16:37:58.043214 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:58.043114 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4vzw2" event={"ID":"0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4","Type":"ContainerStarted","Data":"f7f90c5a9f658a96a8133b178f595bedaf412629da9aad4cc064de9a07faa029"} Apr 23 16:37:58.064964 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:58.064898 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zbx8c" podStartSLOduration=129.245072 podStartE2EDuration="2m11.064882972s" podCreationTimestamp="2026-04-23 16:35:47 +0000 UTC" firstStartedPulling="2026-04-23 16:37:55.898414247 +0000 UTC m=+161.959488479" lastFinishedPulling="2026-04-23 16:37:57.718225219 +0000 UTC 
m=+163.779299451" observedRunningTime="2026-04-23 16:37:58.064700765 +0000 UTC m=+164.125775024" watchObservedRunningTime="2026-04-23 16:37:58.064882972 +0000 UTC m=+164.125957226" Apr 23 16:37:58.084644 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:58.084594 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4vzw2" podStartSLOduration=129.598396214 podStartE2EDuration="2m11.084579816s" podCreationTimestamp="2026-04-23 16:35:47 +0000 UTC" firstStartedPulling="2026-04-23 16:37:55.672948295 +0000 UTC m=+161.734022528" lastFinishedPulling="2026-04-23 16:37:57.159131898 +0000 UTC m=+163.220206130" observedRunningTime="2026-04-23 16:37:58.08387391 +0000 UTC m=+164.144948164" watchObservedRunningTime="2026-04-23 16:37:58.084579816 +0000 UTC m=+164.145654070" Apr 23 16:37:59.046049 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:37:59.046017 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-4vzw2" Apr 23 16:38:00.777936 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:00.777901 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2svfk"] Apr 23 16:38:00.780864 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:00.780847 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2svfk" Apr 23 16:38:00.786099 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:00.786073 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-4qkzl\"" Apr 23 16:38:00.787601 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:00.787584 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 23 16:38:00.818529 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:00.818498 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2svfk"] Apr 23 16:38:00.846517 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:00.846485 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-g4cmx"] Apr 23 16:38:00.849451 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:00.849434 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-g4cmx" Apr 23 16:38:00.853181 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:00.853156 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 16:38:00.853329 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:00.853157 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 16:38:00.853329 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:00.853218 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 16:38:00.853501 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:00.853485 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 16:38:00.853564 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:00.853527 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-kd74h\"" Apr 23 16:38:00.860212 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:00.860193 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-g4cmx"] Apr 23 16:38:00.869042 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:00.869020 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5e762e20-55d7-4136-9407-bb91ee9cadca-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-2svfk\" (UID: \"5e762e20-55d7-4136-9407-bb91ee9cadca\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2svfk" Apr 23 16:38:00.969721 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:00.969680 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4f325d8f-ab8b-445f-8060-6728f5736741-crio-socket\") pod \"insights-runtime-extractor-g4cmx\" (UID: \"4f325d8f-ab8b-445f-8060-6728f5736741\") " pod="openshift-insights/insights-runtime-extractor-g4cmx" Apr 23 16:38:00.969721 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:00.969727 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4f325d8f-ab8b-445f-8060-6728f5736741-data-volume\") pod \"insights-runtime-extractor-g4cmx\" (UID: \"4f325d8f-ab8b-445f-8060-6728f5736741\") " pod="openshift-insights/insights-runtime-extractor-g4cmx" Apr 23 16:38:00.969972 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:00.969787 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5e762e20-55d7-4136-9407-bb91ee9cadca-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-2svfk\" (UID: \"5e762e20-55d7-4136-9407-bb91ee9cadca\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2svfk" Apr 23 16:38:00.969972 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:00.969814 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4f325d8f-ab8b-445f-8060-6728f5736741-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-g4cmx\" (UID: \"4f325d8f-ab8b-445f-8060-6728f5736741\") " pod="openshift-insights/insights-runtime-extractor-g4cmx" Apr 23 16:38:00.969972 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:00.969840 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7nmr\" (UniqueName: 
\"kubernetes.io/projected/4f325d8f-ab8b-445f-8060-6728f5736741-kube-api-access-v7nmr\") pod \"insights-runtime-extractor-g4cmx\" (UID: \"4f325d8f-ab8b-445f-8060-6728f5736741\") " pod="openshift-insights/insights-runtime-extractor-g4cmx" Apr 23 16:38:00.969972 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:00.969872 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4f325d8f-ab8b-445f-8060-6728f5736741-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-g4cmx\" (UID: \"4f325d8f-ab8b-445f-8060-6728f5736741\") " pod="openshift-insights/insights-runtime-extractor-g4cmx" Apr 23 16:38:00.972543 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:00.972518 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5e762e20-55d7-4136-9407-bb91ee9cadca-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-2svfk\" (UID: \"5e762e20-55d7-4136-9407-bb91ee9cadca\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2svfk" Apr 23 16:38:01.071269 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:01.071185 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4f325d8f-ab8b-445f-8060-6728f5736741-crio-socket\") pod \"insights-runtime-extractor-g4cmx\" (UID: \"4f325d8f-ab8b-445f-8060-6728f5736741\") " pod="openshift-insights/insights-runtime-extractor-g4cmx" Apr 23 16:38:01.071269 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:01.071226 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4f325d8f-ab8b-445f-8060-6728f5736741-data-volume\") pod \"insights-runtime-extractor-g4cmx\" (UID: \"4f325d8f-ab8b-445f-8060-6728f5736741\") " pod="openshift-insights/insights-runtime-extractor-g4cmx" Apr 23 
16:38:01.071269 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:01.071261 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4f325d8f-ab8b-445f-8060-6728f5736741-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-g4cmx\" (UID: \"4f325d8f-ab8b-445f-8060-6728f5736741\") " pod="openshift-insights/insights-runtime-extractor-g4cmx" Apr 23 16:38:01.071518 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:01.071281 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v7nmr\" (UniqueName: \"kubernetes.io/projected/4f325d8f-ab8b-445f-8060-6728f5736741-kube-api-access-v7nmr\") pod \"insights-runtime-extractor-g4cmx\" (UID: \"4f325d8f-ab8b-445f-8060-6728f5736741\") " pod="openshift-insights/insights-runtime-extractor-g4cmx" Apr 23 16:38:01.071518 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:01.071332 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4f325d8f-ab8b-445f-8060-6728f5736741-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-g4cmx\" (UID: \"4f325d8f-ab8b-445f-8060-6728f5736741\") " pod="openshift-insights/insights-runtime-extractor-g4cmx" Apr 23 16:38:01.071518 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:01.071328 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4f325d8f-ab8b-445f-8060-6728f5736741-crio-socket\") pod \"insights-runtime-extractor-g4cmx\" (UID: \"4f325d8f-ab8b-445f-8060-6728f5736741\") " pod="openshift-insights/insights-runtime-extractor-g4cmx" Apr 23 16:38:01.071697 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:01.071676 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4f325d8f-ab8b-445f-8060-6728f5736741-data-volume\") pod 
\"insights-runtime-extractor-g4cmx\" (UID: \"4f325d8f-ab8b-445f-8060-6728f5736741\") " pod="openshift-insights/insights-runtime-extractor-g4cmx" Apr 23 16:38:01.071914 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:01.071896 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4f325d8f-ab8b-445f-8060-6728f5736741-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-g4cmx\" (UID: \"4f325d8f-ab8b-445f-8060-6728f5736741\") " pod="openshift-insights/insights-runtime-extractor-g4cmx" Apr 23 16:38:01.073601 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:01.073584 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4f325d8f-ab8b-445f-8060-6728f5736741-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-g4cmx\" (UID: \"4f325d8f-ab8b-445f-8060-6728f5736741\") " pod="openshift-insights/insights-runtime-extractor-g4cmx" Apr 23 16:38:01.086938 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:01.086906 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7nmr\" (UniqueName: \"kubernetes.io/projected/4f325d8f-ab8b-445f-8060-6728f5736741-kube-api-access-v7nmr\") pod \"insights-runtime-extractor-g4cmx\" (UID: \"4f325d8f-ab8b-445f-8060-6728f5736741\") " pod="openshift-insights/insights-runtime-extractor-g4cmx" Apr 23 16:38:01.088793 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:01.088774 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2svfk" Apr 23 16:38:01.157776 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:01.157750 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-g4cmx" Apr 23 16:38:01.209806 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:01.209780 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2svfk"] Apr 23 16:38:01.212733 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:38:01.212701 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e762e20_55d7_4136_9407_bb91ee9cadca.slice/crio-189a3246f3a542c5e3d9154891f23e151ef9cea193ed269068045d5a2a37ec53 WatchSource:0}: Error finding container 189a3246f3a542c5e3d9154891f23e151ef9cea193ed269068045d5a2a37ec53: Status 404 returned error can't find the container with id 189a3246f3a542c5e3d9154891f23e151ef9cea193ed269068045d5a2a37ec53 Apr 23 16:38:01.292380 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:01.292329 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-g4cmx"] Apr 23 16:38:01.294754 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:38:01.294717 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f325d8f_ab8b_445f_8060_6728f5736741.slice/crio-284308a66f7b5422dc8d0745977fc2b0533d5007fb307db19585320f0e72ced3 WatchSource:0}: Error finding container 284308a66f7b5422dc8d0745977fc2b0533d5007fb307db19585320f0e72ced3: Status 404 returned error can't find the container with id 284308a66f7b5422dc8d0745977fc2b0533d5007fb307db19585320f0e72ced3 Apr 23 16:38:01.502618 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:01.502584 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-ph9nc"] Apr 23 16:38:01.506651 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:01.506630 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-ph9nc" Apr 23 16:38:01.509158 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:01.509135 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-vrpsz\"" Apr 23 16:38:01.509303 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:01.509272 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 23 16:38:01.509379 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:01.509360 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 23 16:38:01.528465 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:01.528438 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-ph9nc"] Apr 23 16:38:01.574248 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:01.574215 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8f9p\" (UniqueName: \"kubernetes.io/projected/1148e1b2-bfa3-471a-876a-52bc12750931-kube-api-access-g8f9p\") pod \"downloads-6bcc868b7-ph9nc\" (UID: \"1148e1b2-bfa3-471a-876a-52bc12750931\") " pod="openshift-console/downloads-6bcc868b7-ph9nc" Apr 23 16:38:01.674701 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:01.674618 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g8f9p\" (UniqueName: \"kubernetes.io/projected/1148e1b2-bfa3-471a-876a-52bc12750931-kube-api-access-g8f9p\") pod \"downloads-6bcc868b7-ph9nc\" (UID: \"1148e1b2-bfa3-471a-876a-52bc12750931\") " pod="openshift-console/downloads-6bcc868b7-ph9nc" Apr 23 16:38:01.684163 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:01.684133 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8f9p\" (UniqueName: 
\"kubernetes.io/projected/1148e1b2-bfa3-471a-876a-52bc12750931-kube-api-access-g8f9p\") pod \"downloads-6bcc868b7-ph9nc\" (UID: \"1148e1b2-bfa3-471a-876a-52bc12750931\") " pod="openshift-console/downloads-6bcc868b7-ph9nc" Apr 23 16:38:01.815495 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:01.815189 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-ph9nc" Apr 23 16:38:01.962308 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:01.962270 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-ph9nc"] Apr 23 16:38:01.964576 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:38:01.964543 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1148e1b2_bfa3_471a_876a_52bc12750931.slice/crio-992fa3076508398f5ac64a7880bbbbe5faf9fe6a9a77dc011810b0c88ebf686e WatchSource:0}: Error finding container 992fa3076508398f5ac64a7880bbbbe5faf9fe6a9a77dc011810b0c88ebf686e: Status 404 returned error can't find the container with id 992fa3076508398f5ac64a7880bbbbe5faf9fe6a9a77dc011810b0c88ebf686e Apr 23 16:38:02.055582 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:02.055539 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-ph9nc" event={"ID":"1148e1b2-bfa3-471a-876a-52bc12750931","Type":"ContainerStarted","Data":"992fa3076508398f5ac64a7880bbbbe5faf9fe6a9a77dc011810b0c88ebf686e"} Apr 23 16:38:02.057147 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:02.057119 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-g4cmx" event={"ID":"4f325d8f-ab8b-445f-8060-6728f5736741","Type":"ContainerStarted","Data":"b4dc6ccd62fc3220afb760532be4a11addf76ad954976b588dc7de53babaf545"} Apr 23 16:38:02.057284 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:02.057155 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-g4cmx" event={"ID":"4f325d8f-ab8b-445f-8060-6728f5736741","Type":"ContainerStarted","Data":"284308a66f7b5422dc8d0745977fc2b0533d5007fb307db19585320f0e72ced3"} Apr 23 16:38:02.058280 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:02.058250 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2svfk" event={"ID":"5e762e20-55d7-4136-9407-bb91ee9cadca","Type":"ContainerStarted","Data":"189a3246f3a542c5e3d9154891f23e151ef9cea193ed269068045d5a2a37ec53"} Apr 23 16:38:02.532555 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:02.532528 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5ps7g" Apr 23 16:38:03.062897 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:03.062857 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-g4cmx" event={"ID":"4f325d8f-ab8b-445f-8060-6728f5736741","Type":"ContainerStarted","Data":"98b58f423d7d8b017dfdc86c8b7c3247f5cfd18c94efb123bb6807d6bb4cc676"} Apr 23 16:38:03.064365 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:03.064337 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2svfk" event={"ID":"5e762e20-55d7-4136-9407-bb91ee9cadca","Type":"ContainerStarted","Data":"87195d034a6c6fd0d7ea5c4d2ac271119096782109dbbe765ae868fe718f2abc"} Apr 23 16:38:03.064626 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:03.064604 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2svfk" Apr 23 16:38:03.070392 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:03.070359 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2svfk" Apr 23 
16:38:03.086711 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:03.086666 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2svfk" podStartSLOduration=2.039491961 podStartE2EDuration="3.08665366s" podCreationTimestamp="2026-04-23 16:38:00 +0000 UTC" firstStartedPulling="2026-04-23 16:38:01.2146823 +0000 UTC m=+167.275756532" lastFinishedPulling="2026-04-23 16:38:02.261843999 +0000 UTC m=+168.322918231" observedRunningTime="2026-04-23 16:38:03.08617907 +0000 UTC m=+169.147253325" watchObservedRunningTime="2026-04-23 16:38:03.08665366 +0000 UTC m=+169.147727945" Apr 23 16:38:05.072092 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:05.072052 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-g4cmx" event={"ID":"4f325d8f-ab8b-445f-8060-6728f5736741","Type":"ContainerStarted","Data":"50bc4dc7f95dfc975fc300c72fa80e6ba3576aee77b84be4712f3fe1d39569d1"} Apr 23 16:38:05.095126 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:05.095072 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-g4cmx" podStartSLOduration=2.234334699 podStartE2EDuration="5.095057423s" podCreationTimestamp="2026-04-23 16:38:00 +0000 UTC" firstStartedPulling="2026-04-23 16:38:01.351174059 +0000 UTC m=+167.412248297" lastFinishedPulling="2026-04-23 16:38:04.21189678 +0000 UTC m=+170.272971021" observedRunningTime="2026-04-23 16:38:05.092868603 +0000 UTC m=+171.153942880" watchObservedRunningTime="2026-04-23 16:38:05.095057423 +0000 UTC m=+171.156131747" Apr 23 16:38:09.051355 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:09.051318 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4vzw2" Apr 23 16:38:10.355574 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.355539 2580 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-lkkzb"] Apr 23 16:38:10.359797 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.359325 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lkkzb" Apr 23 16:38:10.367497 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.366967 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 16:38:10.367497 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.367202 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 23 16:38:10.371665 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.369114 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-lgkrw\"" Apr 23 16:38:10.371665 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.369144 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 23 16:38:10.371665 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.369793 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-lkkzb"] Apr 23 16:38:10.371665 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.369284 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 16:38:10.371665 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.370497 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 16:38:10.385113 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.385069 2580 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/node-exporter-bx66h"] Apr 23 16:38:10.388452 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.388384 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-bx66h" Apr 23 16:38:10.391497 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.391479 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 16:38:10.391929 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.391910 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-292nf\"" Apr 23 16:38:10.392214 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.392192 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 16:38:10.392383 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.392198 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 16:38:10.448196 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.448078 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/61ecdfb7-6d32-488b-b3b1-99d8f7adbe0d-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-lkkzb\" (UID: \"61ecdfb7-6d32-488b-b3b1-99d8f7adbe0d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lkkzb" Apr 23 16:38:10.448196 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.448118 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/61ecdfb7-6d32-488b-b3b1-99d8f7adbe0d-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-lkkzb\" (UID: 
\"61ecdfb7-6d32-488b-b3b1-99d8f7adbe0d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lkkzb" Apr 23 16:38:10.448196 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.448148 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5fc385bb-6168-4cf7-9f6f-3a7d17005621-node-exporter-wtmp\") pod \"node-exporter-bx66h\" (UID: \"5fc385bb-6168-4cf7-9f6f-3a7d17005621\") " pod="openshift-monitoring/node-exporter-bx66h" Apr 23 16:38:10.448196 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.448172 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cs7c\" (UniqueName: \"kubernetes.io/projected/61ecdfb7-6d32-488b-b3b1-99d8f7adbe0d-kube-api-access-8cs7c\") pod \"openshift-state-metrics-9d44df66c-lkkzb\" (UID: \"61ecdfb7-6d32-488b-b3b1-99d8f7adbe0d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lkkzb" Apr 23 16:38:10.448826 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.448238 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5fc385bb-6168-4cf7-9f6f-3a7d17005621-root\") pod \"node-exporter-bx66h\" (UID: \"5fc385bb-6168-4cf7-9f6f-3a7d17005621\") " pod="openshift-monitoring/node-exporter-bx66h" Apr 23 16:38:10.448826 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.448270 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5fc385bb-6168-4cf7-9f6f-3a7d17005621-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bx66h\" (UID: \"5fc385bb-6168-4cf7-9f6f-3a7d17005621\") " pod="openshift-monitoring/node-exporter-bx66h" Apr 23 16:38:10.448826 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.448319 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5fc385bb-6168-4cf7-9f6f-3a7d17005621-node-exporter-tls\") pod \"node-exporter-bx66h\" (UID: \"5fc385bb-6168-4cf7-9f6f-3a7d17005621\") " pod="openshift-monitoring/node-exporter-bx66h" Apr 23 16:38:10.448826 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.448394 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7qdv\" (UniqueName: \"kubernetes.io/projected/5fc385bb-6168-4cf7-9f6f-3a7d17005621-kube-api-access-n7qdv\") pod \"node-exporter-bx66h\" (UID: \"5fc385bb-6168-4cf7-9f6f-3a7d17005621\") " pod="openshift-monitoring/node-exporter-bx66h" Apr 23 16:38:10.448826 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.448500 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5fc385bb-6168-4cf7-9f6f-3a7d17005621-node-exporter-accelerators-collector-config\") pod \"node-exporter-bx66h\" (UID: \"5fc385bb-6168-4cf7-9f6f-3a7d17005621\") " pod="openshift-monitoring/node-exporter-bx66h" Apr 23 16:38:10.449068 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.448847 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5fc385bb-6168-4cf7-9f6f-3a7d17005621-metrics-client-ca\") pod \"node-exporter-bx66h\" (UID: \"5fc385bb-6168-4cf7-9f6f-3a7d17005621\") " pod="openshift-monitoring/node-exporter-bx66h" Apr 23 16:38:10.449068 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.448894 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5fc385bb-6168-4cf7-9f6f-3a7d17005621-node-exporter-textfile\") pod \"node-exporter-bx66h\" (UID: 
\"5fc385bb-6168-4cf7-9f6f-3a7d17005621\") " pod="openshift-monitoring/node-exporter-bx66h" Apr 23 16:38:10.449068 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.448929 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/61ecdfb7-6d32-488b-b3b1-99d8f7adbe0d-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-lkkzb\" (UID: \"61ecdfb7-6d32-488b-b3b1-99d8f7adbe0d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lkkzb" Apr 23 16:38:10.449068 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.448958 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5fc385bb-6168-4cf7-9f6f-3a7d17005621-sys\") pod \"node-exporter-bx66h\" (UID: \"5fc385bb-6168-4cf7-9f6f-3a7d17005621\") " pod="openshift-monitoring/node-exporter-bx66h" Apr 23 16:38:10.550198 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.550158 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5fc385bb-6168-4cf7-9f6f-3a7d17005621-node-exporter-textfile\") pod \"node-exporter-bx66h\" (UID: \"5fc385bb-6168-4cf7-9f6f-3a7d17005621\") " pod="openshift-monitoring/node-exporter-bx66h" Apr 23 16:38:10.550396 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.550211 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/61ecdfb7-6d32-488b-b3b1-99d8f7adbe0d-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-lkkzb\" (UID: \"61ecdfb7-6d32-488b-b3b1-99d8f7adbe0d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lkkzb" Apr 23 16:38:10.550396 ip-10-0-128-198 kubenswrapper[2580]: I0423 
16:38:10.550241 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5fc385bb-6168-4cf7-9f6f-3a7d17005621-sys\") pod \"node-exporter-bx66h\" (UID: \"5fc385bb-6168-4cf7-9f6f-3a7d17005621\") " pod="openshift-monitoring/node-exporter-bx66h" Apr 23 16:38:10.550396 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.550271 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/61ecdfb7-6d32-488b-b3b1-99d8f7adbe0d-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-lkkzb\" (UID: \"61ecdfb7-6d32-488b-b3b1-99d8f7adbe0d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lkkzb" Apr 23 16:38:10.550396 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.550351 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5fc385bb-6168-4cf7-9f6f-3a7d17005621-sys\") pod \"node-exporter-bx66h\" (UID: \"5fc385bb-6168-4cf7-9f6f-3a7d17005621\") " pod="openshift-monitoring/node-exporter-bx66h" Apr 23 16:38:10.550615 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.550402 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/61ecdfb7-6d32-488b-b3b1-99d8f7adbe0d-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-lkkzb\" (UID: \"61ecdfb7-6d32-488b-b3b1-99d8f7adbe0d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lkkzb" Apr 23 16:38:10.550615 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.550440 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5fc385bb-6168-4cf7-9f6f-3a7d17005621-node-exporter-wtmp\") pod \"node-exporter-bx66h\" (UID: \"5fc385bb-6168-4cf7-9f6f-3a7d17005621\") " 
pod="openshift-monitoring/node-exporter-bx66h" Apr 23 16:38:10.550615 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.550468 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8cs7c\" (UniqueName: \"kubernetes.io/projected/61ecdfb7-6d32-488b-b3b1-99d8f7adbe0d-kube-api-access-8cs7c\") pod \"openshift-state-metrics-9d44df66c-lkkzb\" (UID: \"61ecdfb7-6d32-488b-b3b1-99d8f7adbe0d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lkkzb" Apr 23 16:38:10.550615 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.550515 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5fc385bb-6168-4cf7-9f6f-3a7d17005621-root\") pod \"node-exporter-bx66h\" (UID: \"5fc385bb-6168-4cf7-9f6f-3a7d17005621\") " pod="openshift-monitoring/node-exporter-bx66h" Apr 23 16:38:10.550615 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.550545 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5fc385bb-6168-4cf7-9f6f-3a7d17005621-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bx66h\" (UID: \"5fc385bb-6168-4cf7-9f6f-3a7d17005621\") " pod="openshift-monitoring/node-exporter-bx66h" Apr 23 16:38:10.550615 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.550574 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5fc385bb-6168-4cf7-9f6f-3a7d17005621-node-exporter-tls\") pod \"node-exporter-bx66h\" (UID: \"5fc385bb-6168-4cf7-9f6f-3a7d17005621\") " pod="openshift-monitoring/node-exporter-bx66h" Apr 23 16:38:10.550615 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.550599 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7qdv\" (UniqueName: 
\"kubernetes.io/projected/5fc385bb-6168-4cf7-9f6f-3a7d17005621-kube-api-access-n7qdv\") pod \"node-exporter-bx66h\" (UID: \"5fc385bb-6168-4cf7-9f6f-3a7d17005621\") " pod="openshift-monitoring/node-exporter-bx66h" Apr 23 16:38:10.550615 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.550606 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5fc385bb-6168-4cf7-9f6f-3a7d17005621-node-exporter-textfile\") pod \"node-exporter-bx66h\" (UID: \"5fc385bb-6168-4cf7-9f6f-3a7d17005621\") " pod="openshift-monitoring/node-exporter-bx66h" Apr 23 16:38:10.550981 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.550641 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5fc385bb-6168-4cf7-9f6f-3a7d17005621-node-exporter-accelerators-collector-config\") pod \"node-exporter-bx66h\" (UID: \"5fc385bb-6168-4cf7-9f6f-3a7d17005621\") " pod="openshift-monitoring/node-exporter-bx66h" Apr 23 16:38:10.550981 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.550676 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5fc385bb-6168-4cf7-9f6f-3a7d17005621-root\") pod \"node-exporter-bx66h\" (UID: \"5fc385bb-6168-4cf7-9f6f-3a7d17005621\") " pod="openshift-monitoring/node-exporter-bx66h" Apr 23 16:38:10.550981 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.550696 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5fc385bb-6168-4cf7-9f6f-3a7d17005621-metrics-client-ca\") pod \"node-exporter-bx66h\" (UID: \"5fc385bb-6168-4cf7-9f6f-3a7d17005621\") " pod="openshift-monitoring/node-exporter-bx66h" Apr 23 16:38:10.550981 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.550782 2580 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5fc385bb-6168-4cf7-9f6f-3a7d17005621-node-exporter-wtmp\") pod \"node-exporter-bx66h\" (UID: \"5fc385bb-6168-4cf7-9f6f-3a7d17005621\") " pod="openshift-monitoring/node-exporter-bx66h" Apr 23 16:38:10.551191 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.550983 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/61ecdfb7-6d32-488b-b3b1-99d8f7adbe0d-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-lkkzb\" (UID: \"61ecdfb7-6d32-488b-b3b1-99d8f7adbe0d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lkkzb" Apr 23 16:38:10.551249 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.551229 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5fc385bb-6168-4cf7-9f6f-3a7d17005621-metrics-client-ca\") pod \"node-exporter-bx66h\" (UID: \"5fc385bb-6168-4cf7-9f6f-3a7d17005621\") " pod="openshift-monitoring/node-exporter-bx66h" Apr 23 16:38:10.552014 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.551937 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5fc385bb-6168-4cf7-9f6f-3a7d17005621-node-exporter-accelerators-collector-config\") pod \"node-exporter-bx66h\" (UID: \"5fc385bb-6168-4cf7-9f6f-3a7d17005621\") " pod="openshift-monitoring/node-exporter-bx66h" Apr 23 16:38:10.553591 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.553567 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5fc385bb-6168-4cf7-9f6f-3a7d17005621-node-exporter-tls\") pod \"node-exporter-bx66h\" (UID: \"5fc385bb-6168-4cf7-9f6f-3a7d17005621\") " pod="openshift-monitoring/node-exporter-bx66h" Apr 23 16:38:10.553785 ip-10-0-128-198 
kubenswrapper[2580]: I0423 16:38:10.553764 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/61ecdfb7-6d32-488b-b3b1-99d8f7adbe0d-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-lkkzb\" (UID: \"61ecdfb7-6d32-488b-b3b1-99d8f7adbe0d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lkkzb" Apr 23 16:38:10.554386 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.554362 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/61ecdfb7-6d32-488b-b3b1-99d8f7adbe0d-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-lkkzb\" (UID: \"61ecdfb7-6d32-488b-b3b1-99d8f7adbe0d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lkkzb" Apr 23 16:38:10.554477 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.554399 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5fc385bb-6168-4cf7-9f6f-3a7d17005621-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bx66h\" (UID: \"5fc385bb-6168-4cf7-9f6f-3a7d17005621\") " pod="openshift-monitoring/node-exporter-bx66h" Apr 23 16:38:10.561875 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.561847 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7qdv\" (UniqueName: \"kubernetes.io/projected/5fc385bb-6168-4cf7-9f6f-3a7d17005621-kube-api-access-n7qdv\") pod \"node-exporter-bx66h\" (UID: \"5fc385bb-6168-4cf7-9f6f-3a7d17005621\") " pod="openshift-monitoring/node-exporter-bx66h" Apr 23 16:38:10.562163 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.562139 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cs7c\" (UniqueName: 
\"kubernetes.io/projected/61ecdfb7-6d32-488b-b3b1-99d8f7adbe0d-kube-api-access-8cs7c\") pod \"openshift-state-metrics-9d44df66c-lkkzb\" (UID: \"61ecdfb7-6d32-488b-b3b1-99d8f7adbe0d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lkkzb" Apr 23 16:38:10.674458 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.674379 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lkkzb" Apr 23 16:38:10.700187 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.700148 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-bx66h" Apr 23 16:38:10.964408 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.964324 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-59b4688497-tvbfp"] Apr 23 16:38:10.968719 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.968685 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59b4688497-tvbfp" Apr 23 16:38:10.973062 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.973032 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 23 16:38:10.973062 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.973052 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-kpqph\"" Apr 23 16:38:10.973236 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.973085 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 23 16:38:10.973795 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.973771 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 23 16:38:10.973920 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.973870 2580 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 23 16:38:10.974239 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.974148 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 23 16:38:10.981263 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:10.981241 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59b4688497-tvbfp"] Apr 23 16:38:11.055070 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:11.055035 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-console-oauth-config\") pod \"console-59b4688497-tvbfp\" (UID: \"f1e2c330-41e1-48b4-b81e-b9cfd0556afd\") " pod="openshift-console/console-59b4688497-tvbfp" Apr 23 16:38:11.055247 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:11.055086 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-service-ca\") pod \"console-59b4688497-tvbfp\" (UID: \"f1e2c330-41e1-48b4-b81e-b9cfd0556afd\") " pod="openshift-console/console-59b4688497-tvbfp" Apr 23 16:38:11.055247 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:11.055115 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-oauth-serving-cert\") pod \"console-59b4688497-tvbfp\" (UID: \"f1e2c330-41e1-48b4-b81e-b9cfd0556afd\") " pod="openshift-console/console-59b4688497-tvbfp" Apr 23 16:38:11.055247 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:11.055230 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-console-serving-cert\") pod \"console-59b4688497-tvbfp\" (UID: \"f1e2c330-41e1-48b4-b81e-b9cfd0556afd\") " pod="openshift-console/console-59b4688497-tvbfp" Apr 23 16:38:11.055427 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:11.055273 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-console-config\") pod \"console-59b4688497-tvbfp\" (UID: \"f1e2c330-41e1-48b4-b81e-b9cfd0556afd\") " pod="openshift-console/console-59b4688497-tvbfp" Apr 23 16:38:11.055427 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:11.055308 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52wzt\" (UniqueName: \"kubernetes.io/projected/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-kube-api-access-52wzt\") pod \"console-59b4688497-tvbfp\" (UID: \"f1e2c330-41e1-48b4-b81e-b9cfd0556afd\") " pod="openshift-console/console-59b4688497-tvbfp" Apr 23 16:38:11.156476 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:11.156442 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-console-oauth-config\") pod \"console-59b4688497-tvbfp\" (UID: \"f1e2c330-41e1-48b4-b81e-b9cfd0556afd\") " pod="openshift-console/console-59b4688497-tvbfp" Apr 23 16:38:11.156677 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:11.156486 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-service-ca\") pod \"console-59b4688497-tvbfp\" (UID: \"f1e2c330-41e1-48b4-b81e-b9cfd0556afd\") " pod="openshift-console/console-59b4688497-tvbfp" Apr 23 16:38:11.156677 ip-10-0-128-198 kubenswrapper[2580]: 
I0423 16:38:11.156506 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-oauth-serving-cert\") pod \"console-59b4688497-tvbfp\" (UID: \"f1e2c330-41e1-48b4-b81e-b9cfd0556afd\") " pod="openshift-console/console-59b4688497-tvbfp" Apr 23 16:38:11.156677 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:11.156565 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-console-serving-cert\") pod \"console-59b4688497-tvbfp\" (UID: \"f1e2c330-41e1-48b4-b81e-b9cfd0556afd\") " pod="openshift-console/console-59b4688497-tvbfp" Apr 23 16:38:11.156677 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:11.156597 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-console-config\") pod \"console-59b4688497-tvbfp\" (UID: \"f1e2c330-41e1-48b4-b81e-b9cfd0556afd\") " pod="openshift-console/console-59b4688497-tvbfp" Apr 23 16:38:11.156677 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:11.156624 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-52wzt\" (UniqueName: \"kubernetes.io/projected/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-kube-api-access-52wzt\") pod \"console-59b4688497-tvbfp\" (UID: \"f1e2c330-41e1-48b4-b81e-b9cfd0556afd\") " pod="openshift-console/console-59b4688497-tvbfp" Apr 23 16:38:11.157306 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:11.157261 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-service-ca\") pod \"console-59b4688497-tvbfp\" (UID: \"f1e2c330-41e1-48b4-b81e-b9cfd0556afd\") " pod="openshift-console/console-59b4688497-tvbfp" Apr 
23 16:38:11.157503 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:11.157480 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-oauth-serving-cert\") pod \"console-59b4688497-tvbfp\" (UID: \"f1e2c330-41e1-48b4-b81e-b9cfd0556afd\") " pod="openshift-console/console-59b4688497-tvbfp" Apr 23 16:38:11.157580 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:11.157486 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-console-config\") pod \"console-59b4688497-tvbfp\" (UID: \"f1e2c330-41e1-48b4-b81e-b9cfd0556afd\") " pod="openshift-console/console-59b4688497-tvbfp" Apr 23 16:38:11.159490 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:11.159468 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-console-oauth-config\") pod \"console-59b4688497-tvbfp\" (UID: \"f1e2c330-41e1-48b4-b81e-b9cfd0556afd\") " pod="openshift-console/console-59b4688497-tvbfp" Apr 23 16:38:11.159490 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:11.159486 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-console-serving-cert\") pod \"console-59b4688497-tvbfp\" (UID: \"f1e2c330-41e1-48b4-b81e-b9cfd0556afd\") " pod="openshift-console/console-59b4688497-tvbfp" Apr 23 16:38:11.167287 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:11.167264 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-52wzt\" (UniqueName: \"kubernetes.io/projected/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-kube-api-access-52wzt\") pod \"console-59b4688497-tvbfp\" (UID: \"f1e2c330-41e1-48b4-b81e-b9cfd0556afd\") " 
pod="openshift-console/console-59b4688497-tvbfp" Apr 23 16:38:11.281042 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:11.281001 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59b4688497-tvbfp" Apr 23 16:38:12.476672 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.476639 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-f7784b7cd-krvh2"] Apr 23 16:38:12.481477 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.481451 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" Apr 23 16:38:12.484147 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.484119 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 23 16:38:12.484811 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.484788 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-d1nom4pgs9vis\"" Apr 23 16:38:12.484936 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.484868 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-wnsrm\"" Apr 23 16:38:12.484936 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.484889 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 23 16:38:12.485485 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.485467 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 23 16:38:12.485874 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.485853 2580 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 23 16:38:12.485874 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.485872 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 23 16:38:12.493244 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.493226 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-f7784b7cd-krvh2"] Apr 23 16:38:12.566724 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.566687 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3cbcad22-6cfd-4816-a73c-152549b91eeb-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-f7784b7cd-krvh2\" (UID: \"3cbcad22-6cfd-4816-a73c-152549b91eeb\") " pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" Apr 23 16:38:12.566919 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.566752 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3cbcad22-6cfd-4816-a73c-152549b91eeb-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-f7784b7cd-krvh2\" (UID: \"3cbcad22-6cfd-4816-a73c-152549b91eeb\") " pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" Apr 23 16:38:12.566919 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.566785 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3cbcad22-6cfd-4816-a73c-152549b91eeb-secret-thanos-querier-tls\") pod \"thanos-querier-f7784b7cd-krvh2\" (UID: \"3cbcad22-6cfd-4816-a73c-152549b91eeb\") " pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" Apr 23 16:38:12.566919 
ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.566813 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f8p9\" (UniqueName: \"kubernetes.io/projected/3cbcad22-6cfd-4816-a73c-152549b91eeb-kube-api-access-2f8p9\") pod \"thanos-querier-f7784b7cd-krvh2\" (UID: \"3cbcad22-6cfd-4816-a73c-152549b91eeb\") " pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" Apr 23 16:38:12.566919 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.566908 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3cbcad22-6cfd-4816-a73c-152549b91eeb-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-f7784b7cd-krvh2\" (UID: \"3cbcad22-6cfd-4816-a73c-152549b91eeb\") " pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" Apr 23 16:38:12.567126 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.566983 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3cbcad22-6cfd-4816-a73c-152549b91eeb-metrics-client-ca\") pod \"thanos-querier-f7784b7cd-krvh2\" (UID: \"3cbcad22-6cfd-4816-a73c-152549b91eeb\") " pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" Apr 23 16:38:12.567126 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.567017 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3cbcad22-6cfd-4816-a73c-152549b91eeb-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-f7784b7cd-krvh2\" (UID: \"3cbcad22-6cfd-4816-a73c-152549b91eeb\") " pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" Apr 23 16:38:12.567126 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.567049 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3cbcad22-6cfd-4816-a73c-152549b91eeb-secret-grpc-tls\") pod \"thanos-querier-f7784b7cd-krvh2\" (UID: \"3cbcad22-6cfd-4816-a73c-152549b91eeb\") " pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" Apr 23 16:38:12.668120 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.668076 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3cbcad22-6cfd-4816-a73c-152549b91eeb-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-f7784b7cd-krvh2\" (UID: \"3cbcad22-6cfd-4816-a73c-152549b91eeb\") " pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" Apr 23 16:38:12.668120 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.668120 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3cbcad22-6cfd-4816-a73c-152549b91eeb-secret-thanos-querier-tls\") pod \"thanos-querier-f7784b7cd-krvh2\" (UID: \"3cbcad22-6cfd-4816-a73c-152549b91eeb\") " pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" Apr 23 16:38:12.668381 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.668148 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2f8p9\" (UniqueName: \"kubernetes.io/projected/3cbcad22-6cfd-4816-a73c-152549b91eeb-kube-api-access-2f8p9\") pod \"thanos-querier-f7784b7cd-krvh2\" (UID: \"3cbcad22-6cfd-4816-a73c-152549b91eeb\") " pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" Apr 23 16:38:12.668381 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.668196 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/3cbcad22-6cfd-4816-a73c-152549b91eeb-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-f7784b7cd-krvh2\" (UID: \"3cbcad22-6cfd-4816-a73c-152549b91eeb\") " pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" Apr 23 16:38:12.668381 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.668242 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3cbcad22-6cfd-4816-a73c-152549b91eeb-metrics-client-ca\") pod \"thanos-querier-f7784b7cd-krvh2\" (UID: \"3cbcad22-6cfd-4816-a73c-152549b91eeb\") " pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" Apr 23 16:38:12.668381 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.668271 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3cbcad22-6cfd-4816-a73c-152549b91eeb-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-f7784b7cd-krvh2\" (UID: \"3cbcad22-6cfd-4816-a73c-152549b91eeb\") " pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" Apr 23 16:38:12.668381 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.668315 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3cbcad22-6cfd-4816-a73c-152549b91eeb-secret-grpc-tls\") pod \"thanos-querier-f7784b7cd-krvh2\" (UID: \"3cbcad22-6cfd-4816-a73c-152549b91eeb\") " pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" Apr 23 16:38:12.668631 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.668393 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3cbcad22-6cfd-4816-a73c-152549b91eeb-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-f7784b7cd-krvh2\" (UID: \"3cbcad22-6cfd-4816-a73c-152549b91eeb\") " 
pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" Apr 23 16:38:12.669487 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.669420 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3cbcad22-6cfd-4816-a73c-152549b91eeb-metrics-client-ca\") pod \"thanos-querier-f7784b7cd-krvh2\" (UID: \"3cbcad22-6cfd-4816-a73c-152549b91eeb\") " pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" Apr 23 16:38:12.671393 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.671345 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3cbcad22-6cfd-4816-a73c-152549b91eeb-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-f7784b7cd-krvh2\" (UID: \"3cbcad22-6cfd-4816-a73c-152549b91eeb\") " pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" Apr 23 16:38:12.671568 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.671418 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3cbcad22-6cfd-4816-a73c-152549b91eeb-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-f7784b7cd-krvh2\" (UID: \"3cbcad22-6cfd-4816-a73c-152549b91eeb\") " pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" Apr 23 16:38:12.671568 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.671476 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3cbcad22-6cfd-4816-a73c-152549b91eeb-secret-grpc-tls\") pod \"thanos-querier-f7784b7cd-krvh2\" (UID: \"3cbcad22-6cfd-4816-a73c-152549b91eeb\") " pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" Apr 23 16:38:12.671877 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.671854 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3cbcad22-6cfd-4816-a73c-152549b91eeb-secret-thanos-querier-tls\") pod \"thanos-querier-f7784b7cd-krvh2\" (UID: \"3cbcad22-6cfd-4816-a73c-152549b91eeb\") " pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" Apr 23 16:38:12.671993 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.671964 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3cbcad22-6cfd-4816-a73c-152549b91eeb-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-f7784b7cd-krvh2\" (UID: \"3cbcad22-6cfd-4816-a73c-152549b91eeb\") " pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" Apr 23 16:38:12.672159 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.672139 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3cbcad22-6cfd-4816-a73c-152549b91eeb-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-f7784b7cd-krvh2\" (UID: \"3cbcad22-6cfd-4816-a73c-152549b91eeb\") " pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" Apr 23 16:38:12.677742 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.677723 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f8p9\" (UniqueName: \"kubernetes.io/projected/3cbcad22-6cfd-4816-a73c-152549b91eeb-kube-api-access-2f8p9\") pod \"thanos-querier-f7784b7cd-krvh2\" (UID: \"3cbcad22-6cfd-4816-a73c-152549b91eeb\") " pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" Apr 23 16:38:12.794037 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:12.793959 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" Apr 23 16:38:15.121855 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:15.121797 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-58gvv"] Apr 23 16:38:15.126415 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:15.126387 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-58gvv" Apr 23 16:38:15.129773 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:15.129038 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-v2fkv\"" Apr 23 16:38:15.129773 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:15.129332 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 23 16:38:15.132506 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:15.132480 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-58gvv"] Apr 23 16:38:15.191589 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:15.191551 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/86ae9c55-d565-4339-b6c1-9cb99a952b70-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-58gvv\" (UID: \"86ae9c55-d565-4339-b6c1-9cb99a952b70\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-58gvv" Apr 23 16:38:15.292991 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:15.292950 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/86ae9c55-d565-4339-b6c1-9cb99a952b70-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-58gvv\" (UID: \"86ae9c55-d565-4339-b6c1-9cb99a952b70\") " 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-58gvv" Apr 23 16:38:15.293188 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:38:15.293101 2580 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 23 16:38:15.293188 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:38:15.293176 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86ae9c55-d565-4339-b6c1-9cb99a952b70-monitoring-plugin-cert podName:86ae9c55-d565-4339-b6c1-9cb99a952b70 nodeName:}" failed. No retries permitted until 2026-04-23 16:38:15.793155446 +0000 UTC m=+181.854229678 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/86ae9c55-d565-4339-b6c1-9cb99a952b70-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-58gvv" (UID: "86ae9c55-d565-4339-b6c1-9cb99a952b70") : secret "monitoring-plugin-cert" not found Apr 23 16:38:15.533050 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:15.533005 2580 patch_prober.go:28] interesting pod/image-registry-dbbdff94c-xrmkj container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 16:38:15.533223 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:15.533068 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" podUID="0a538e20-bab6-46fe-9017-54d1d693ba8c" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:38:15.798333 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:15.798228 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/86ae9c55-d565-4339-b6c1-9cb99a952b70-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-58gvv\" (UID: \"86ae9c55-d565-4339-b6c1-9cb99a952b70\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-58gvv" Apr 23 16:38:15.801035 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:15.801008 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/86ae9c55-d565-4339-b6c1-9cb99a952b70-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-58gvv\" (UID: \"86ae9c55-d565-4339-b6c1-9cb99a952b70\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-58gvv" Apr 23 16:38:16.038915 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:16.038880 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-58gvv" Apr 23 16:38:17.044280 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:17.044245 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:38:17.622886 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:17.622146 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-lkkzb"] Apr 23 16:38:17.623889 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:38:17.623858 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61ecdfb7_6d32_488b_b3b1_99d8f7adbe0d.slice/crio-7cd2d20ba20da01717b8982bf0bd4a0996f3e5a4f30211462d6446b2d9c5341d WatchSource:0}: Error finding container 7cd2d20ba20da01717b8982bf0bd4a0996f3e5a4f30211462d6446b2d9c5341d: Status 404 returned error can't find the container with id 7cd2d20ba20da01717b8982bf0bd4a0996f3e5a4f30211462d6446b2d9c5341d Apr 23 16:38:17.640160 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:17.639832 2580 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-58gvv"] Apr 23 16:38:17.644258 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:38:17.644227 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86ae9c55_d565_4339_b6c1_9cb99a952b70.slice/crio-0a8cb96418e4e2378f82c9595d19929075d6a03f5ba3f7a10eec33042d4f0a96 WatchSource:0}: Error finding container 0a8cb96418e4e2378f82c9595d19929075d6a03f5ba3f7a10eec33042d4f0a96: Status 404 returned error can't find the container with id 0a8cb96418e4e2378f82c9595d19929075d6a03f5ba3f7a10eec33042d4f0a96 Apr 23 16:38:17.872165 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:17.872131 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59b4688497-tvbfp"] Apr 23 16:38:17.875190 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:17.875149 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-f7784b7cd-krvh2"] Apr 23 16:38:17.875583 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:38:17.875537 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1e2c330_41e1_48b4_b81e_b9cfd0556afd.slice/crio-0a928a039c51307d96b35168f04e893ae570528f71ad31ee3f51c762517ccfb5 WatchSource:0}: Error finding container 0a928a039c51307d96b35168f04e893ae570528f71ad31ee3f51c762517ccfb5: Status 404 returned error can't find the container with id 0a928a039c51307d96b35168f04e893ae570528f71ad31ee3f51c762517ccfb5 Apr 23 16:38:17.878246 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:38:17.878079 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cbcad22_6cfd_4816_a73c_152549b91eeb.slice/crio-3d13bdd4a6ce63f76c6082c04755a9abf79147010416673ded323dc371c4eb2d WatchSource:0}: Error finding container 
3d13bdd4a6ce63f76c6082c04755a9abf79147010416673ded323dc371c4eb2d: Status 404 returned error can't find the container with id 3d13bdd4a6ce63f76c6082c04755a9abf79147010416673ded323dc371c4eb2d Apr 23 16:38:18.113336 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:18.113233 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59b4688497-tvbfp" event={"ID":"f1e2c330-41e1-48b4-b81e-b9cfd0556afd","Type":"ContainerStarted","Data":"0a928a039c51307d96b35168f04e893ae570528f71ad31ee3f51c762517ccfb5"} Apr 23 16:38:18.114976 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:18.114948 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bx66h" event={"ID":"5fc385bb-6168-4cf7-9f6f-3a7d17005621","Type":"ContainerStarted","Data":"600dd1e6fb4fc3951a4a218e3d672ef7b1f7983e8aad80ca833172d0ac5a2784"} Apr 23 16:38:18.116734 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:18.116707 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-58gvv" event={"ID":"86ae9c55-d565-4339-b6c1-9cb99a952b70","Type":"ContainerStarted","Data":"0a8cb96418e4e2378f82c9595d19929075d6a03f5ba3f7a10eec33042d4f0a96"} Apr 23 16:38:18.119407 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:18.119376 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-ph9nc" event={"ID":"1148e1b2-bfa3-471a-876a-52bc12750931","Type":"ContainerStarted","Data":"b80a5f331fcd5852110b69a865a0ff8b45ec3bdee458229a9a2567a0446b0563"} Apr 23 16:38:18.119928 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:18.119880 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-ph9nc" Apr 23 16:38:18.122313 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:18.122273 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lkkzb" 
event={"ID":"61ecdfb7-6d32-488b-b3b1-99d8f7adbe0d","Type":"ContainerStarted","Data":"89d1d86a005614b7c99e249153bb95eee24944dad7058448ed402217cd1d9120"} Apr 23 16:38:18.122488 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:18.122319 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lkkzb" event={"ID":"61ecdfb7-6d32-488b-b3b1-99d8f7adbe0d","Type":"ContainerStarted","Data":"4f83ee8b7dfa66272a273a37cd8c41b49f6288ab5785fd01be458f55df45d4e4"} Apr 23 16:38:18.122488 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:18.122334 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lkkzb" event={"ID":"61ecdfb7-6d32-488b-b3b1-99d8f7adbe0d","Type":"ContainerStarted","Data":"7cd2d20ba20da01717b8982bf0bd4a0996f3e5a4f30211462d6446b2d9c5341d"} Apr 23 16:38:18.123989 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:18.123965 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" event={"ID":"3cbcad22-6cfd-4816-a73c-152549b91eeb","Type":"ContainerStarted","Data":"3d13bdd4a6ce63f76c6082c04755a9abf79147010416673ded323dc371c4eb2d"} Apr 23 16:38:18.132413 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:18.132384 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-ph9nc" Apr 23 16:38:18.167987 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:18.167686 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-ph9nc" podStartSLOduration=1.5545309189999998 podStartE2EDuration="17.167667846s" podCreationTimestamp="2026-04-23 16:38:01 +0000 UTC" firstStartedPulling="2026-04-23 16:38:01.966526132 +0000 UTC m=+168.027600367" lastFinishedPulling="2026-04-23 16:38:17.579663048 +0000 UTC m=+183.640737294" observedRunningTime="2026-04-23 16:38:18.145609664 +0000 UTC m=+184.206683939" 
watchObservedRunningTime="2026-04-23 16:38:18.167667846 +0000 UTC m=+184.228742104" Apr 23 16:38:19.135803 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:19.135604 2580 generic.go:358] "Generic (PLEG): container finished" podID="5fc385bb-6168-4cf7-9f6f-3a7d17005621" containerID="401d43a2e7119a944b9bdcaa9caa7c74930c268a7f2caf03d414000c315c58ae" exitCode=0 Apr 23 16:38:19.136391 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:19.136359 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bx66h" event={"ID":"5fc385bb-6168-4cf7-9f6f-3a7d17005621","Type":"ContainerDied","Data":"401d43a2e7119a944b9bdcaa9caa7c74930c268a7f2caf03d414000c315c58ae"} Apr 23 16:38:20.741191 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:20.741110 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6dfd99f556-drgmr"] Apr 23 16:38:20.769257 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:20.769221 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6dfd99f556-drgmr"] Apr 23 16:38:20.769468 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:20.769387 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6dfd99f556-drgmr" Apr 23 16:38:20.779425 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:20.779397 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 23 16:38:20.847072 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:20.847036 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5a031aac-7d65-4acb-845d-f7a232e069e7-oauth-serving-cert\") pod \"console-6dfd99f556-drgmr\" (UID: \"5a031aac-7d65-4acb-845d-f7a232e069e7\") " pod="openshift-console/console-6dfd99f556-drgmr" Apr 23 16:38:20.847344 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:20.847099 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a031aac-7d65-4acb-845d-f7a232e069e7-console-serving-cert\") pod \"console-6dfd99f556-drgmr\" (UID: \"5a031aac-7d65-4acb-845d-f7a232e069e7\") " pod="openshift-console/console-6dfd99f556-drgmr" Apr 23 16:38:20.847344 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:20.847202 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a031aac-7d65-4acb-845d-f7a232e069e7-trusted-ca-bundle\") pod \"console-6dfd99f556-drgmr\" (UID: \"5a031aac-7d65-4acb-845d-f7a232e069e7\") " pod="openshift-console/console-6dfd99f556-drgmr" Apr 23 16:38:20.847344 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:20.847230 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxw4z\" (UniqueName: \"kubernetes.io/projected/5a031aac-7d65-4acb-845d-f7a232e069e7-kube-api-access-zxw4z\") pod \"console-6dfd99f556-drgmr\" (UID: \"5a031aac-7d65-4acb-845d-f7a232e069e7\") " 
pod="openshift-console/console-6dfd99f556-drgmr" Apr 23 16:38:20.847344 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:20.847264 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5a031aac-7d65-4acb-845d-f7a232e069e7-console-config\") pod \"console-6dfd99f556-drgmr\" (UID: \"5a031aac-7d65-4acb-845d-f7a232e069e7\") " pod="openshift-console/console-6dfd99f556-drgmr" Apr 23 16:38:20.847680 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:20.847428 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a031aac-7d65-4acb-845d-f7a232e069e7-service-ca\") pod \"console-6dfd99f556-drgmr\" (UID: \"5a031aac-7d65-4acb-845d-f7a232e069e7\") " pod="openshift-console/console-6dfd99f556-drgmr" Apr 23 16:38:20.847680 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:20.847475 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5a031aac-7d65-4acb-845d-f7a232e069e7-console-oauth-config\") pod \"console-6dfd99f556-drgmr\" (UID: \"5a031aac-7d65-4acb-845d-f7a232e069e7\") " pod="openshift-console/console-6dfd99f556-drgmr" Apr 23 16:38:20.948567 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:20.948480 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5a031aac-7d65-4acb-845d-f7a232e069e7-console-config\") pod \"console-6dfd99f556-drgmr\" (UID: \"5a031aac-7d65-4acb-845d-f7a232e069e7\") " pod="openshift-console/console-6dfd99f556-drgmr" Apr 23 16:38:20.948567 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:20.948530 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/5a031aac-7d65-4acb-845d-f7a232e069e7-service-ca\") pod \"console-6dfd99f556-drgmr\" (UID: \"5a031aac-7d65-4acb-845d-f7a232e069e7\") " pod="openshift-console/console-6dfd99f556-drgmr" Apr 23 16:38:20.948567 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:20.948566 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5a031aac-7d65-4acb-845d-f7a232e069e7-console-oauth-config\") pod \"console-6dfd99f556-drgmr\" (UID: \"5a031aac-7d65-4acb-845d-f7a232e069e7\") " pod="openshift-console/console-6dfd99f556-drgmr" Apr 23 16:38:20.948849 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:20.948599 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5a031aac-7d65-4acb-845d-f7a232e069e7-oauth-serving-cert\") pod \"console-6dfd99f556-drgmr\" (UID: \"5a031aac-7d65-4acb-845d-f7a232e069e7\") " pod="openshift-console/console-6dfd99f556-drgmr" Apr 23 16:38:20.948849 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:20.948645 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a031aac-7d65-4acb-845d-f7a232e069e7-console-serving-cert\") pod \"console-6dfd99f556-drgmr\" (UID: \"5a031aac-7d65-4acb-845d-f7a232e069e7\") " pod="openshift-console/console-6dfd99f556-drgmr" Apr 23 16:38:20.948849 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:20.948745 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a031aac-7d65-4acb-845d-f7a232e069e7-trusted-ca-bundle\") pod \"console-6dfd99f556-drgmr\" (UID: \"5a031aac-7d65-4acb-845d-f7a232e069e7\") " pod="openshift-console/console-6dfd99f556-drgmr" Apr 23 16:38:20.948849 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:20.948772 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zxw4z\" (UniqueName: \"kubernetes.io/projected/5a031aac-7d65-4acb-845d-f7a232e069e7-kube-api-access-zxw4z\") pod \"console-6dfd99f556-drgmr\" (UID: \"5a031aac-7d65-4acb-845d-f7a232e069e7\") " pod="openshift-console/console-6dfd99f556-drgmr" Apr 23 16:38:20.949325 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:20.949275 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5a031aac-7d65-4acb-845d-f7a232e069e7-console-config\") pod \"console-6dfd99f556-drgmr\" (UID: \"5a031aac-7d65-4acb-845d-f7a232e069e7\") " pod="openshift-console/console-6dfd99f556-drgmr" Apr 23 16:38:20.949457 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:20.949375 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5a031aac-7d65-4acb-845d-f7a232e069e7-oauth-serving-cert\") pod \"console-6dfd99f556-drgmr\" (UID: \"5a031aac-7d65-4acb-845d-f7a232e069e7\") " pod="openshift-console/console-6dfd99f556-drgmr" Apr 23 16:38:20.950185 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:20.950149 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a031aac-7d65-4acb-845d-f7a232e069e7-service-ca\") pod \"console-6dfd99f556-drgmr\" (UID: \"5a031aac-7d65-4acb-845d-f7a232e069e7\") " pod="openshift-console/console-6dfd99f556-drgmr" Apr 23 16:38:20.950895 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:20.950873 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a031aac-7d65-4acb-845d-f7a232e069e7-trusted-ca-bundle\") pod \"console-6dfd99f556-drgmr\" (UID: \"5a031aac-7d65-4acb-845d-f7a232e069e7\") " pod="openshift-console/console-6dfd99f556-drgmr" Apr 23 16:38:20.951909 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:20.951886 
2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5a031aac-7d65-4acb-845d-f7a232e069e7-console-oauth-config\") pod \"console-6dfd99f556-drgmr\" (UID: \"5a031aac-7d65-4acb-845d-f7a232e069e7\") " pod="openshift-console/console-6dfd99f556-drgmr" Apr 23 16:38:20.952617 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:20.952594 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a031aac-7d65-4acb-845d-f7a232e069e7-console-serving-cert\") pod \"console-6dfd99f556-drgmr\" (UID: \"5a031aac-7d65-4acb-845d-f7a232e069e7\") " pod="openshift-console/console-6dfd99f556-drgmr" Apr 23 16:38:20.960196 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:20.960173 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxw4z\" (UniqueName: \"kubernetes.io/projected/5a031aac-7d65-4acb-845d-f7a232e069e7-kube-api-access-zxw4z\") pod \"console-6dfd99f556-drgmr\" (UID: \"5a031aac-7d65-4acb-845d-f7a232e069e7\") " pod="openshift-console/console-6dfd99f556-drgmr" Apr 23 16:38:21.082051 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:21.082006 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6dfd99f556-drgmr" Apr 23 16:38:22.672323 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:22.671801 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6dfd99f556-drgmr"] Apr 23 16:38:22.676704 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:38:22.676631 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a031aac_7d65_4acb_845d_f7a232e069e7.slice/crio-f4fdafd5a9c1cffbf58c720b1c3e1588927b33047d8d493b3ce8899ea5a7f1f9 WatchSource:0}: Error finding container f4fdafd5a9c1cffbf58c720b1c3e1588927b33047d8d493b3ce8899ea5a7f1f9: Status 404 returned error can't find the container with id f4fdafd5a9c1cffbf58c720b1c3e1588927b33047d8d493b3ce8899ea5a7f1f9 Apr 23 16:38:23.150382 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:23.150341 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" event={"ID":"3cbcad22-6cfd-4816-a73c-152549b91eeb","Type":"ContainerStarted","Data":"39dac1cffbd14f22d458dc388f6e723627fe23c15a8ff9059cb1a5aacbe50b75"} Apr 23 16:38:23.150382 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:23.150387 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" event={"ID":"3cbcad22-6cfd-4816-a73c-152549b91eeb","Type":"ContainerStarted","Data":"cc163ba1c104347800e735724c5c8a8c30474b8b34e469e83fc3d0c27a076976"} Apr 23 16:38:23.150603 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:23.150403 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" event={"ID":"3cbcad22-6cfd-4816-a73c-152549b91eeb","Type":"ContainerStarted","Data":"c35f7b28b8b1c5353d79ad029290b195ea2d2b8548990dc842b921f2592cda13"} Apr 23 16:38:23.151870 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:23.151841 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-58gvv" event={"ID":"86ae9c55-d565-4339-b6c1-9cb99a952b70","Type":"ContainerStarted","Data":"a882d5d2aecf933fe61dbb54e11dbdb3c0f339f425b716ff6499fc563bdba04a"} Apr 23 16:38:23.152189 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:23.152170 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-58gvv" Apr 23 16:38:23.154083 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:23.154045 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lkkzb" event={"ID":"61ecdfb7-6d32-488b-b3b1-99d8f7adbe0d","Type":"ContainerStarted","Data":"216552c1d8f4fcb021f0f001f7b64a02286cc949a0a9472301559858bc2afbcf"} Apr 23 16:38:23.155725 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:23.155701 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dfd99f556-drgmr" event={"ID":"5a031aac-7d65-4acb-845d-f7a232e069e7","Type":"ContainerStarted","Data":"ec1bdf70f0d39ae6a5f5b593066c81316ac6541120d2cd94a2c54cbfc7a76827"} Apr 23 16:38:23.155817 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:23.155730 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dfd99f556-drgmr" event={"ID":"5a031aac-7d65-4acb-845d-f7a232e069e7","Type":"ContainerStarted","Data":"f4fdafd5a9c1cffbf58c720b1c3e1588927b33047d8d493b3ce8899ea5a7f1f9"} Apr 23 16:38:23.157613 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:23.157585 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59b4688497-tvbfp" event={"ID":"f1e2c330-41e1-48b4-b81e-b9cfd0556afd","Type":"ContainerStarted","Data":"403b6b552dd9a8455593f7ff54b31ad920510abb7a5f05edfbba3eeda1352523"} Apr 23 16:38:23.158152 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:23.158132 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-58gvv" Apr 23 16:38:23.159832 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:23.159809 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bx66h" event={"ID":"5fc385bb-6168-4cf7-9f6f-3a7d17005621","Type":"ContainerStarted","Data":"3de59ca818d6b78f771ac9777597cd38853aa4223b9b16132f21d35b1c7bd58d"} Apr 23 16:38:23.159930 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:23.159837 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bx66h" event={"ID":"5fc385bb-6168-4cf7-9f6f-3a7d17005621","Type":"ContainerStarted","Data":"2644fe3f834ad868c2b4d4c529c7465c3e746e5d559760d89bf7ec02f7e702dc"} Apr 23 16:38:23.174356 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:23.174306 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-58gvv" podStartSLOduration=3.836412352 podStartE2EDuration="8.174276002s" podCreationTimestamp="2026-04-23 16:38:15 +0000 UTC" firstStartedPulling="2026-04-23 16:38:17.64633243 +0000 UTC m=+183.707406665" lastFinishedPulling="2026-04-23 16:38:21.984196081 +0000 UTC m=+188.045270315" observedRunningTime="2026-04-23 16:38:23.170943504 +0000 UTC m=+189.232017796" watchObservedRunningTime="2026-04-23 16:38:23.174276002 +0000 UTC m=+189.235350258" Apr 23 16:38:23.203350 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:23.203280 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-bx66h" podStartSLOduration=12.434191647 podStartE2EDuration="13.203263818s" podCreationTimestamp="2026-04-23 16:38:10 +0000 UTC" firstStartedPulling="2026-04-23 16:38:17.50231253 +0000 UTC m=+183.563386768" lastFinishedPulling="2026-04-23 16:38:18.2713847 +0000 UTC m=+184.332458939" observedRunningTime="2026-04-23 16:38:23.201811492 +0000 UTC m=+189.262885748" watchObservedRunningTime="2026-04-23 
16:38:23.203263818 +0000 UTC m=+189.264338073" Apr 23 16:38:23.294947 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:23.294897 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lkkzb" podStartSLOduration=8.635332598 podStartE2EDuration="13.294882991s" podCreationTimestamp="2026-04-23 16:38:10 +0000 UTC" firstStartedPulling="2026-04-23 16:38:17.780040092 +0000 UTC m=+183.841114331" lastFinishedPulling="2026-04-23 16:38:22.439590492 +0000 UTC m=+188.500664724" observedRunningTime="2026-04-23 16:38:23.264278606 +0000 UTC m=+189.325352872" watchObservedRunningTime="2026-04-23 16:38:23.294882991 +0000 UTC m=+189.355957245" Apr 23 16:38:23.295114 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:23.295045 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6dfd99f556-drgmr" podStartSLOduration=3.2950382559999998 podStartE2EDuration="3.295038256s" podCreationTimestamp="2026-04-23 16:38:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:38:23.294151711 +0000 UTC m=+189.355226006" watchObservedRunningTime="2026-04-23 16:38:23.295038256 +0000 UTC m=+189.356112511" Apr 23 16:38:23.323097 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:23.323038 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-59b4688497-tvbfp" podStartSLOduration=8.704004819 podStartE2EDuration="13.323018372s" podCreationTimestamp="2026-04-23 16:38:10 +0000 UTC" firstStartedPulling="2026-04-23 16:38:17.878380015 +0000 UTC m=+183.939454261" lastFinishedPulling="2026-04-23 16:38:22.497393577 +0000 UTC m=+188.558467814" observedRunningTime="2026-04-23 16:38:23.321377879 +0000 UTC m=+189.382452143" watchObservedRunningTime="2026-04-23 16:38:23.323018372 +0000 UTC m=+189.384092624" Apr 23 16:38:23.369825 ip-10-0-128-198 
kubenswrapper[2580]: I0423 16:38:23.369789 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-dbbdff94c-xrmkj"] Apr 23 16:38:25.170221 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:25.170174 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" event={"ID":"3cbcad22-6cfd-4816-a73c-152549b91eeb","Type":"ContainerStarted","Data":"a8e43b714215be60a1434275935f2a41a7e15f5a13791db4ede5ef6d1f9f361b"} Apr 23 16:38:25.170833 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:25.170228 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" event={"ID":"3cbcad22-6cfd-4816-a73c-152549b91eeb","Type":"ContainerStarted","Data":"5df0b23cd724f140a718651cc89fd0f7faea789c9674e69746f33f68591958e9"} Apr 23 16:38:25.170833 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:25.170246 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" event={"ID":"3cbcad22-6cfd-4816-a73c-152549b91eeb","Type":"ContainerStarted","Data":"7d86f72dd6f2df7c06f5c7d87f9801d522ffb805f102b52975510b155dbb9031"} Apr 23 16:38:25.198989 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:25.198939 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" podStartSLOduration=6.925434291 podStartE2EDuration="13.198923507s" podCreationTimestamp="2026-04-23 16:38:12 +0000 UTC" firstStartedPulling="2026-04-23 16:38:17.880270543 +0000 UTC m=+183.941344779" lastFinishedPulling="2026-04-23 16:38:24.153759749 +0000 UTC m=+190.214833995" observedRunningTime="2026-04-23 16:38:25.197590815 +0000 UTC m=+191.258665079" watchObservedRunningTime="2026-04-23 16:38:25.198923507 +0000 UTC m=+191.259997758" Apr 23 16:38:26.173331 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:26.173281 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" Apr 23 16:38:27.182566 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:27.182538 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-f7784b7cd-krvh2" Apr 23 16:38:31.082762 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:31.082724 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6dfd99f556-drgmr" Apr 23 16:38:31.082762 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:31.082765 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6dfd99f556-drgmr" Apr 23 16:38:31.087624 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:31.087599 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6dfd99f556-drgmr" Apr 23 16:38:31.190480 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:31.190450 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6dfd99f556-drgmr" Apr 23 16:38:31.249246 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:31.249020 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59b4688497-tvbfp"] Apr 23 16:38:31.282114 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:31.282076 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-59b4688497-tvbfp" Apr 23 16:38:48.393558 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:48.393517 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" podUID="0a538e20-bab6-46fe-9017-54d1d693ba8c" containerName="registry" containerID="cri-o://8a6d8b85e1b6b80f855f2a38e5aec1fc6632ae314aa1ce5bbb73ea8efd5520b1" gracePeriod=30 Apr 23 16:38:48.647789 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:48.647732 2580 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:38:48.686528 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:48.686469 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-tls\") pod \"0a538e20-bab6-46fe-9017-54d1d693ba8c\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " Apr 23 16:38:48.686528 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:48.686530 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0a538e20-bab6-46fe-9017-54d1d693ba8c-image-registry-private-configuration\") pod \"0a538e20-bab6-46fe-9017-54d1d693ba8c\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " Apr 23 16:38:48.686805 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:48.686581 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a538e20-bab6-46fe-9017-54d1d693ba8c-trusted-ca\") pod \"0a538e20-bab6-46fe-9017-54d1d693ba8c\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " Apr 23 16:38:48.686805 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:48.686608 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-certificates\") pod \"0a538e20-bab6-46fe-9017-54d1d693ba8c\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " Apr 23 16:38:48.686805 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:48.686658 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsk6d\" (UniqueName: \"kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-kube-api-access-rsk6d\") pod 
\"0a538e20-bab6-46fe-9017-54d1d693ba8c\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " Apr 23 16:38:48.686805 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:48.686712 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a538e20-bab6-46fe-9017-54d1d693ba8c-installation-pull-secrets\") pod \"0a538e20-bab6-46fe-9017-54d1d693ba8c\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " Apr 23 16:38:48.686805 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:48.686735 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-bound-sa-token\") pod \"0a538e20-bab6-46fe-9017-54d1d693ba8c\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " Apr 23 16:38:48.686805 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:48.686762 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0a538e20-bab6-46fe-9017-54d1d693ba8c-ca-trust-extracted\") pod \"0a538e20-bab6-46fe-9017-54d1d693ba8c\" (UID: \"0a538e20-bab6-46fe-9017-54d1d693ba8c\") " Apr 23 16:38:48.687118 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:48.687047 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a538e20-bab6-46fe-9017-54d1d693ba8c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "0a538e20-bab6-46fe-9017-54d1d693ba8c" (UID: "0a538e20-bab6-46fe-9017-54d1d693ba8c"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:38:48.687443 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:48.687413 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "0a538e20-bab6-46fe-9017-54d1d693ba8c" (UID: "0a538e20-bab6-46fe-9017-54d1d693ba8c"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:38:48.689642 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:48.689590 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "0a538e20-bab6-46fe-9017-54d1d693ba8c" (UID: "0a538e20-bab6-46fe-9017-54d1d693ba8c"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:38:48.689780 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:48.689741 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a538e20-bab6-46fe-9017-54d1d693ba8c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "0a538e20-bab6-46fe-9017-54d1d693ba8c" (UID: "0a538e20-bab6-46fe-9017-54d1d693ba8c"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:38:48.689874 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:48.689850 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "0a538e20-bab6-46fe-9017-54d1d693ba8c" (UID: "0a538e20-bab6-46fe-9017-54d1d693ba8c"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:38:48.690318 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:48.690274 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a538e20-bab6-46fe-9017-54d1d693ba8c-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "0a538e20-bab6-46fe-9017-54d1d693ba8c" (UID: "0a538e20-bab6-46fe-9017-54d1d693ba8c"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:38:48.690625 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:48.690591 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-kube-api-access-rsk6d" (OuterVolumeSpecName: "kube-api-access-rsk6d") pod "0a538e20-bab6-46fe-9017-54d1d693ba8c" (UID: "0a538e20-bab6-46fe-9017-54d1d693ba8c"). InnerVolumeSpecName "kube-api-access-rsk6d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:38:48.697681 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:48.697654 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a538e20-bab6-46fe-9017-54d1d693ba8c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "0a538e20-bab6-46fe-9017-54d1d693ba8c" (UID: "0a538e20-bab6-46fe-9017-54d1d693ba8c"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:38:48.788126 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:48.788088 2580 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0a538e20-bab6-46fe-9017-54d1d693ba8c-image-registry-private-configuration\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:38:48.788126 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:48.788121 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a538e20-bab6-46fe-9017-54d1d693ba8c-trusted-ca\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:38:48.788126 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:48.788133 2580 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-certificates\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:38:48.788396 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:48.788143 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rsk6d\" (UniqueName: \"kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-kube-api-access-rsk6d\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:38:48.788396 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:48.788152 2580 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a538e20-bab6-46fe-9017-54d1d693ba8c-installation-pull-secrets\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:38:48.788396 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:48.788161 2580 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-bound-sa-token\") on node \"ip-10-0-128-198.ec2.internal\" 
DevicePath \"\"" Apr 23 16:38:48.788396 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:48.788169 2580 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0a538e20-bab6-46fe-9017-54d1d693ba8c-ca-trust-extracted\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:38:48.788396 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:48.788179 2580 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a538e20-bab6-46fe-9017-54d1d693ba8c-registry-tls\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:38:49.240047 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:49.240007 2580 generic.go:358] "Generic (PLEG): container finished" podID="0a538e20-bab6-46fe-9017-54d1d693ba8c" containerID="8a6d8b85e1b6b80f855f2a38e5aec1fc6632ae314aa1ce5bbb73ea8efd5520b1" exitCode=0 Apr 23 16:38:49.240222 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:49.240101 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" Apr 23 16:38:49.240222 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:49.240104 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" event={"ID":"0a538e20-bab6-46fe-9017-54d1d693ba8c","Type":"ContainerDied","Data":"8a6d8b85e1b6b80f855f2a38e5aec1fc6632ae314aa1ce5bbb73ea8efd5520b1"} Apr 23 16:38:49.240222 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:49.240159 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-dbbdff94c-xrmkj" event={"ID":"0a538e20-bab6-46fe-9017-54d1d693ba8c","Type":"ContainerDied","Data":"0612b87f973d1d2f8e4975935e1b53c2865f659dbb9e522377ca3da77cdd54ca"} Apr 23 16:38:49.240222 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:49.240180 2580 scope.go:117] "RemoveContainer" containerID="8a6d8b85e1b6b80f855f2a38e5aec1fc6632ae314aa1ce5bbb73ea8efd5520b1" Apr 23 16:38:49.248537 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:49.248522 2580 scope.go:117] "RemoveContainer" containerID="8a6d8b85e1b6b80f855f2a38e5aec1fc6632ae314aa1ce5bbb73ea8efd5520b1" Apr 23 16:38:49.248779 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:38:49.248756 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a6d8b85e1b6b80f855f2a38e5aec1fc6632ae314aa1ce5bbb73ea8efd5520b1\": container with ID starting with 8a6d8b85e1b6b80f855f2a38e5aec1fc6632ae314aa1ce5bbb73ea8efd5520b1 not found: ID does not exist" containerID="8a6d8b85e1b6b80f855f2a38e5aec1fc6632ae314aa1ce5bbb73ea8efd5520b1" Apr 23 16:38:49.248826 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:49.248787 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a6d8b85e1b6b80f855f2a38e5aec1fc6632ae314aa1ce5bbb73ea8efd5520b1"} err="failed to get container status 
\"8a6d8b85e1b6b80f855f2a38e5aec1fc6632ae314aa1ce5bbb73ea8efd5520b1\": rpc error: code = NotFound desc = could not find container \"8a6d8b85e1b6b80f855f2a38e5aec1fc6632ae314aa1ce5bbb73ea8efd5520b1\": container with ID starting with 8a6d8b85e1b6b80f855f2a38e5aec1fc6632ae314aa1ce5bbb73ea8efd5520b1 not found: ID does not exist" Apr 23 16:38:49.263081 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:49.263060 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-dbbdff94c-xrmkj"] Apr 23 16:38:49.265576 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:49.265556 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-dbbdff94c-xrmkj"] Apr 23 16:38:50.537146 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:50.537112 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a538e20-bab6-46fe-9017-54d1d693ba8c" path="/var/lib/kubelet/pods/0a538e20-bab6-46fe-9017-54d1d693ba8c/volumes" Apr 23 16:38:56.267637 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:56.267599 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-59b4688497-tvbfp" podUID="f1e2c330-41e1-48b4-b81e-b9cfd0556afd" containerName="console" containerID="cri-o://403b6b552dd9a8455593f7ff54b31ad920510abb7a5f05edfbba3eeda1352523" gracePeriod=15 Apr 23 16:38:56.520168 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:56.520111 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59b4688497-tvbfp_f1e2c330-41e1-48b4-b81e-b9cfd0556afd/console/0.log" Apr 23 16:38:56.520304 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:56.520172 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59b4688497-tvbfp" Apr 23 16:38:56.651959 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:56.651920 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-oauth-serving-cert\") pod \"f1e2c330-41e1-48b4-b81e-b9cfd0556afd\" (UID: \"f1e2c330-41e1-48b4-b81e-b9cfd0556afd\") " Apr 23 16:38:56.652137 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:56.651968 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52wzt\" (UniqueName: \"kubernetes.io/projected/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-kube-api-access-52wzt\") pod \"f1e2c330-41e1-48b4-b81e-b9cfd0556afd\" (UID: \"f1e2c330-41e1-48b4-b81e-b9cfd0556afd\") " Apr 23 16:38:56.652137 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:56.651995 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-console-config\") pod \"f1e2c330-41e1-48b4-b81e-b9cfd0556afd\" (UID: \"f1e2c330-41e1-48b4-b81e-b9cfd0556afd\") " Apr 23 16:38:56.652137 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:56.652026 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-service-ca\") pod \"f1e2c330-41e1-48b4-b81e-b9cfd0556afd\" (UID: \"f1e2c330-41e1-48b4-b81e-b9cfd0556afd\") " Apr 23 16:38:56.652137 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:56.652065 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-console-oauth-config\") pod \"f1e2c330-41e1-48b4-b81e-b9cfd0556afd\" (UID: \"f1e2c330-41e1-48b4-b81e-b9cfd0556afd\") " Apr 23 16:38:56.652137 
ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:56.652091 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-console-serving-cert\") pod \"f1e2c330-41e1-48b4-b81e-b9cfd0556afd\" (UID: \"f1e2c330-41e1-48b4-b81e-b9cfd0556afd\") " Apr 23 16:38:56.652461 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:56.652433 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f1e2c330-41e1-48b4-b81e-b9cfd0556afd" (UID: "f1e2c330-41e1-48b4-b81e-b9cfd0556afd"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:38:56.652526 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:56.652475 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-console-config" (OuterVolumeSpecName: "console-config") pod "f1e2c330-41e1-48b4-b81e-b9cfd0556afd" (UID: "f1e2c330-41e1-48b4-b81e-b9cfd0556afd"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:38:56.652571 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:56.652513 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-service-ca" (OuterVolumeSpecName: "service-ca") pod "f1e2c330-41e1-48b4-b81e-b9cfd0556afd" (UID: "f1e2c330-41e1-48b4-b81e-b9cfd0556afd"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:38:56.654316 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:56.654255 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f1e2c330-41e1-48b4-b81e-b9cfd0556afd" (UID: "f1e2c330-41e1-48b4-b81e-b9cfd0556afd"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:38:56.654426 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:56.654306 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f1e2c330-41e1-48b4-b81e-b9cfd0556afd" (UID: "f1e2c330-41e1-48b4-b81e-b9cfd0556afd"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:38:56.654426 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:56.654308 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-kube-api-access-52wzt" (OuterVolumeSpecName: "kube-api-access-52wzt") pod "f1e2c330-41e1-48b4-b81e-b9cfd0556afd" (UID: "f1e2c330-41e1-48b4-b81e-b9cfd0556afd"). InnerVolumeSpecName "kube-api-access-52wzt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:38:56.753588 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:56.753552 2580 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-oauth-serving-cert\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:38:56.753588 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:56.753582 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-52wzt\" (UniqueName: \"kubernetes.io/projected/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-kube-api-access-52wzt\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:38:56.753588 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:56.753594 2580 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-console-config\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:38:56.753828 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:56.753603 2580 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-service-ca\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:38:56.753828 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:56.753612 2580 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-console-oauth-config\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:38:56.753828 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:56.753620 2580 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1e2c330-41e1-48b4-b81e-b9cfd0556afd-console-serving-cert\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:38:57.267567 ip-10-0-128-198 
kubenswrapper[2580]: I0423 16:38:57.267538 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59b4688497-tvbfp_f1e2c330-41e1-48b4-b81e-b9cfd0556afd/console/0.log" Apr 23 16:38:57.267729 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:57.267581 2580 generic.go:358] "Generic (PLEG): container finished" podID="f1e2c330-41e1-48b4-b81e-b9cfd0556afd" containerID="403b6b552dd9a8455593f7ff54b31ad920510abb7a5f05edfbba3eeda1352523" exitCode=2 Apr 23 16:38:57.267729 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:57.267636 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59b4688497-tvbfp" event={"ID":"f1e2c330-41e1-48b4-b81e-b9cfd0556afd","Type":"ContainerDied","Data":"403b6b552dd9a8455593f7ff54b31ad920510abb7a5f05edfbba3eeda1352523"} Apr 23 16:38:57.267729 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:57.267664 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59b4688497-tvbfp" event={"ID":"f1e2c330-41e1-48b4-b81e-b9cfd0556afd","Type":"ContainerDied","Data":"0a928a039c51307d96b35168f04e893ae570528f71ad31ee3f51c762517ccfb5"} Apr 23 16:38:57.267729 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:57.267674 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59b4688497-tvbfp" Apr 23 16:38:57.268100 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:57.267682 2580 scope.go:117] "RemoveContainer" containerID="403b6b552dd9a8455593f7ff54b31ad920510abb7a5f05edfbba3eeda1352523" Apr 23 16:38:57.276009 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:57.275993 2580 scope.go:117] "RemoveContainer" containerID="403b6b552dd9a8455593f7ff54b31ad920510abb7a5f05edfbba3eeda1352523" Apr 23 16:38:57.276251 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:38:57.276227 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"403b6b552dd9a8455593f7ff54b31ad920510abb7a5f05edfbba3eeda1352523\": container with ID starting with 403b6b552dd9a8455593f7ff54b31ad920510abb7a5f05edfbba3eeda1352523 not found: ID does not exist" containerID="403b6b552dd9a8455593f7ff54b31ad920510abb7a5f05edfbba3eeda1352523" Apr 23 16:38:57.276319 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:57.276251 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"403b6b552dd9a8455593f7ff54b31ad920510abb7a5f05edfbba3eeda1352523"} err="failed to get container status \"403b6b552dd9a8455593f7ff54b31ad920510abb7a5f05edfbba3eeda1352523\": rpc error: code = NotFound desc = could not find container \"403b6b552dd9a8455593f7ff54b31ad920510abb7a5f05edfbba3eeda1352523\": container with ID starting with 403b6b552dd9a8455593f7ff54b31ad920510abb7a5f05edfbba3eeda1352523 not found: ID does not exist" Apr 23 16:38:57.289481 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:57.289459 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59b4688497-tvbfp"] Apr 23 16:38:57.293371 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:38:57.293349 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-59b4688497-tvbfp"] Apr 23 16:38:58.536977 ip-10-0-128-198 kubenswrapper[2580]: 
I0423 16:38:58.536944 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1e2c330-41e1-48b4-b81e-b9cfd0556afd" path="/var/lib/kubelet/pods/f1e2c330-41e1-48b4-b81e-b9cfd0556afd/volumes" Apr 23 16:39:08.299793 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:08.299759 2580 generic.go:358] "Generic (PLEG): container finished" podID="45986c9e-a457-4f12-b928-2ef7295dbf7a" containerID="8953babe3d006d26aba5555b2cfbe4d21d1595e5b4cd2bd1b35e92bb4edb55a8" exitCode=0 Apr 23 16:39:08.300236 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:08.299800 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rq7kn" event={"ID":"45986c9e-a457-4f12-b928-2ef7295dbf7a","Type":"ContainerDied","Data":"8953babe3d006d26aba5555b2cfbe4d21d1595e5b4cd2bd1b35e92bb4edb55a8"} Apr 23 16:39:08.300236 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:08.300087 2580 scope.go:117] "RemoveContainer" containerID="8953babe3d006d26aba5555b2cfbe4d21d1595e5b4cd2bd1b35e92bb4edb55a8" Apr 23 16:39:09.305946 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:09.305911 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rq7kn" event={"ID":"45986c9e-a457-4f12-b928-2ef7295dbf7a","Type":"ContainerStarted","Data":"eedb9ce1bb79b7e3a219a47b24277012a482d799f06be4a8543c11ffd9955b04"} Apr 23 16:39:09.307146 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:09.307120 2580 generic.go:358] "Generic (PLEG): container finished" podID="046b94e7-f586-4a7f-b68c-ca66c74a0ab6" containerID="669fe8612425d2ef1aa841bf298afe24adfb7f9ec7b22e39daad62a0f028b628" exitCode=0 Apr 23 16:39:09.307250 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:09.307193 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-77tn6" 
event={"ID":"046b94e7-f586-4a7f-b68c-ca66c74a0ab6","Type":"ContainerDied","Data":"669fe8612425d2ef1aa841bf298afe24adfb7f9ec7b22e39daad62a0f028b628"} Apr 23 16:39:09.307493 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:09.307480 2580 scope.go:117] "RemoveContainer" containerID="669fe8612425d2ef1aa841bf298afe24adfb7f9ec7b22e39daad62a0f028b628" Apr 23 16:39:10.312223 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:10.312192 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-77tn6" event={"ID":"046b94e7-f586-4a7f-b68c-ca66c74a0ab6","Type":"ContainerStarted","Data":"e95adac27d915b2decc1494f824947fa3ba07d96557e3c41b8975c84a42eb567"} Apr 23 16:39:26.297355 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:26.297317 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-metrics-certs\") pod \"network-metrics-daemon-5ps7g\" (UID: \"4033b659-eaae-4ad3-a8a3-523bdf5fcf89\") " pod="openshift-multus/network-metrics-daemon-5ps7g" Apr 23 16:39:26.299618 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:26.299597 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4033b659-eaae-4ad3-a8a3-523bdf5fcf89-metrics-certs\") pod \"network-metrics-daemon-5ps7g\" (UID: \"4033b659-eaae-4ad3-a8a3-523bdf5fcf89\") " pod="openshift-multus/network-metrics-daemon-5ps7g" Apr 23 16:39:26.536437 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:26.536407 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-nldpm\"" Apr 23 16:39:26.543575 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:26.543559 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5ps7g" Apr 23 16:39:26.662013 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:26.661978 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5ps7g"] Apr 23 16:39:26.665088 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:39:26.665057 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4033b659_eaae_4ad3_a8a3_523bdf5fcf89.slice/crio-4e80402bf00f596c20df7e8f9c01e050d18fbbc82230ca4c4adaa9f6e35de46b WatchSource:0}: Error finding container 4e80402bf00f596c20df7e8f9c01e050d18fbbc82230ca4c4adaa9f6e35de46b: Status 404 returned error can't find the container with id 4e80402bf00f596c20df7e8f9c01e050d18fbbc82230ca4c4adaa9f6e35de46b Apr 23 16:39:27.358813 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:27.358776 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5ps7g" event={"ID":"4033b659-eaae-4ad3-a8a3-523bdf5fcf89","Type":"ContainerStarted","Data":"4e80402bf00f596c20df7e8f9c01e050d18fbbc82230ca4c4adaa9f6e35de46b"} Apr 23 16:39:28.364573 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:28.364477 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5ps7g" event={"ID":"4033b659-eaae-4ad3-a8a3-523bdf5fcf89","Type":"ContainerStarted","Data":"935c26d936b311a609dae1533c19fc47c7b9b1024279d98dad768b21bfb70a10"} Apr 23 16:39:28.364573 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:28.364519 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5ps7g" event={"ID":"4033b659-eaae-4ad3-a8a3-523bdf5fcf89","Type":"ContainerStarted","Data":"39fe25147638d480a43c0d22639a39b65ea524f69ab11147115a33e17cb93dcb"} Apr 23 16:39:28.385453 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:28.385404 2580 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-5ps7g" podStartSLOduration=253.486074194 podStartE2EDuration="4m14.385389647s" podCreationTimestamp="2026-04-23 16:35:14 +0000 UTC" firstStartedPulling="2026-04-23 16:39:26.666846157 +0000 UTC m=+252.727920390" lastFinishedPulling="2026-04-23 16:39:27.56616161 +0000 UTC m=+253.627235843" observedRunningTime="2026-04-23 16:39:28.385275128 +0000 UTC m=+254.446349382" watchObservedRunningTime="2026-04-23 16:39:28.385389647 +0000 UTC m=+254.446463900" Apr 23 16:39:38.089695 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:38.089655 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-68fdff544c-rq5nw"] Apr 23 16:39:38.090067 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:38.089973 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a538e20-bab6-46fe-9017-54d1d693ba8c" containerName="registry" Apr 23 16:39:38.090067 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:38.089985 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a538e20-bab6-46fe-9017-54d1d693ba8c" containerName="registry" Apr 23 16:39:38.090067 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:38.090007 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1e2c330-41e1-48b4-b81e-b9cfd0556afd" containerName="console" Apr 23 16:39:38.090067 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:38.090013 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1e2c330-41e1-48b4-b81e-b9cfd0556afd" containerName="console" Apr 23 16:39:38.090067 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:38.090059 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1e2c330-41e1-48b4-b81e-b9cfd0556afd" containerName="console" Apr 23 16:39:38.090067 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:38.090071 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a538e20-bab6-46fe-9017-54d1d693ba8c" containerName="registry" Apr 23 
16:39:38.091936 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:38.091908 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68fdff544c-rq5nw" Apr 23 16:39:38.117719 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:38.117695 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68fdff544c-rq5nw"] Apr 23 16:39:38.184762 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:38.184723 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/671f1ce5-4c41-43fe-92fa-f0a9b4491462-oauth-serving-cert\") pod \"console-68fdff544c-rq5nw\" (UID: \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\") " pod="openshift-console/console-68fdff544c-rq5nw" Apr 23 16:39:38.184762 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:38.184763 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/671f1ce5-4c41-43fe-92fa-f0a9b4491462-console-serving-cert\") pod \"console-68fdff544c-rq5nw\" (UID: \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\") " pod="openshift-console/console-68fdff544c-rq5nw" Apr 23 16:39:38.185009 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:38.184783 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/671f1ce5-4c41-43fe-92fa-f0a9b4491462-service-ca\") pod \"console-68fdff544c-rq5nw\" (UID: \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\") " pod="openshift-console/console-68fdff544c-rq5nw" Apr 23 16:39:38.185009 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:38.184801 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-977tq\" (UniqueName: \"kubernetes.io/projected/671f1ce5-4c41-43fe-92fa-f0a9b4491462-kube-api-access-977tq\") pod 
\"console-68fdff544c-rq5nw\" (UID: \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\") " pod="openshift-console/console-68fdff544c-rq5nw" Apr 23 16:39:38.185009 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:38.184860 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/671f1ce5-4c41-43fe-92fa-f0a9b4491462-console-oauth-config\") pod \"console-68fdff544c-rq5nw\" (UID: \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\") " pod="openshift-console/console-68fdff544c-rq5nw" Apr 23 16:39:38.185009 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:38.184941 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/671f1ce5-4c41-43fe-92fa-f0a9b4491462-console-config\") pod \"console-68fdff544c-rq5nw\" (UID: \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\") " pod="openshift-console/console-68fdff544c-rq5nw" Apr 23 16:39:38.185009 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:38.184971 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671f1ce5-4c41-43fe-92fa-f0a9b4491462-trusted-ca-bundle\") pod \"console-68fdff544c-rq5nw\" (UID: \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\") " pod="openshift-console/console-68fdff544c-rq5nw" Apr 23 16:39:38.285556 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:38.285511 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/671f1ce5-4c41-43fe-92fa-f0a9b4491462-console-config\") pod \"console-68fdff544c-rq5nw\" (UID: \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\") " pod="openshift-console/console-68fdff544c-rq5nw" Apr 23 16:39:38.285556 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:38.285551 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671f1ce5-4c41-43fe-92fa-f0a9b4491462-trusted-ca-bundle\") pod \"console-68fdff544c-rq5nw\" (UID: \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\") " pod="openshift-console/console-68fdff544c-rq5nw" Apr 23 16:39:38.285836 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:38.285582 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/671f1ce5-4c41-43fe-92fa-f0a9b4491462-oauth-serving-cert\") pod \"console-68fdff544c-rq5nw\" (UID: \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\") " pod="openshift-console/console-68fdff544c-rq5nw" Apr 23 16:39:38.285836 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:38.285604 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/671f1ce5-4c41-43fe-92fa-f0a9b4491462-console-serving-cert\") pod \"console-68fdff544c-rq5nw\" (UID: \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\") " pod="openshift-console/console-68fdff544c-rq5nw" Apr 23 16:39:38.285836 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:38.285622 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/671f1ce5-4c41-43fe-92fa-f0a9b4491462-service-ca\") pod \"console-68fdff544c-rq5nw\" (UID: \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\") " pod="openshift-console/console-68fdff544c-rq5nw" Apr 23 16:39:38.285836 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:38.285736 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-977tq\" (UniqueName: \"kubernetes.io/projected/671f1ce5-4c41-43fe-92fa-f0a9b4491462-kube-api-access-977tq\") pod \"console-68fdff544c-rq5nw\" (UID: \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\") " pod="openshift-console/console-68fdff544c-rq5nw" Apr 23 16:39:38.285836 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:38.285810 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/671f1ce5-4c41-43fe-92fa-f0a9b4491462-console-oauth-config\") pod \"console-68fdff544c-rq5nw\" (UID: \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\") " pod="openshift-console/console-68fdff544c-rq5nw" Apr 23 16:39:38.286350 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:38.286317 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/671f1ce5-4c41-43fe-92fa-f0a9b4491462-console-config\") pod \"console-68fdff544c-rq5nw\" (UID: \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\") " pod="openshift-console/console-68fdff544c-rq5nw" Apr 23 16:39:38.286350 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:38.286329 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/671f1ce5-4c41-43fe-92fa-f0a9b4491462-service-ca\") pod \"console-68fdff544c-rq5nw\" (UID: \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\") " pod="openshift-console/console-68fdff544c-rq5nw" Apr 23 16:39:38.286499 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:38.286479 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671f1ce5-4c41-43fe-92fa-f0a9b4491462-trusted-ca-bundle\") pod \"console-68fdff544c-rq5nw\" (UID: \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\") " pod="openshift-console/console-68fdff544c-rq5nw" Apr 23 16:39:38.286588 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:38.286563 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/671f1ce5-4c41-43fe-92fa-f0a9b4491462-oauth-serving-cert\") pod \"console-68fdff544c-rq5nw\" (UID: \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\") " pod="openshift-console/console-68fdff544c-rq5nw" Apr 23 16:39:38.288124 ip-10-0-128-198 kubenswrapper[2580]: 
I0423 16:39:38.288101 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/671f1ce5-4c41-43fe-92fa-f0a9b4491462-console-serving-cert\") pod \"console-68fdff544c-rq5nw\" (UID: \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\") " pod="openshift-console/console-68fdff544c-rq5nw" Apr 23 16:39:38.288216 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:38.288199 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/671f1ce5-4c41-43fe-92fa-f0a9b4491462-console-oauth-config\") pod \"console-68fdff544c-rq5nw\" (UID: \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\") " pod="openshift-console/console-68fdff544c-rq5nw" Apr 23 16:39:38.294698 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:38.294670 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-977tq\" (UniqueName: \"kubernetes.io/projected/671f1ce5-4c41-43fe-92fa-f0a9b4491462-kube-api-access-977tq\") pod \"console-68fdff544c-rq5nw\" (UID: \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\") " pod="openshift-console/console-68fdff544c-rq5nw" Apr 23 16:39:38.400628 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:38.400557 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68fdff544c-rq5nw" Apr 23 16:39:38.524528 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:38.524499 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68fdff544c-rq5nw"] Apr 23 16:39:38.527563 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:39:38.527530 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod671f1ce5_4c41_43fe_92fa_f0a9b4491462.slice/crio-e8e92368f6c34f0b804eaae8a38eb313f906ee92678915de86080e37eaff1a18 WatchSource:0}: Error finding container e8e92368f6c34f0b804eaae8a38eb313f906ee92678915de86080e37eaff1a18: Status 404 returned error can't find the container with id e8e92368f6c34f0b804eaae8a38eb313f906ee92678915de86080e37eaff1a18 Apr 23 16:39:39.396723 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:39.396683 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68fdff544c-rq5nw" event={"ID":"671f1ce5-4c41-43fe-92fa-f0a9b4491462","Type":"ContainerStarted","Data":"b7e9f5e855a7663c126b9fdf7fa4f245105a8d30ac1a6691d9d5bef63f998208"} Apr 23 16:39:39.396723 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:39.396724 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68fdff544c-rq5nw" event={"ID":"671f1ce5-4c41-43fe-92fa-f0a9b4491462","Type":"ContainerStarted","Data":"e8e92368f6c34f0b804eaae8a38eb313f906ee92678915de86080e37eaff1a18"} Apr 23 16:39:39.415986 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:39.415933 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-68fdff544c-rq5nw" podStartSLOduration=1.415917815 podStartE2EDuration="1.415917815s" podCreationTimestamp="2026-04-23 16:39:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:39:39.415112528 +0000 UTC 
m=+265.476186791" watchObservedRunningTime="2026-04-23 16:39:39.415917815 +0000 UTC m=+265.476992068" Apr 23 16:39:48.400747 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:48.400706 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-68fdff544c-rq5nw" Apr 23 16:39:48.400747 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:48.400753 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-68fdff544c-rq5nw" Apr 23 16:39:48.405319 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:48.405277 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-68fdff544c-rq5nw" Apr 23 16:39:48.424938 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:48.424910 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-68fdff544c-rq5nw" Apr 23 16:39:48.479199 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:39:48.479169 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6dfd99f556-drgmr"] Apr 23 16:40:13.498774 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:13.498717 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6dfd99f556-drgmr" podUID="5a031aac-7d65-4acb-845d-f7a232e069e7" containerName="console" containerID="cri-o://ec1bdf70f0d39ae6a5f5b593066c81316ac6541120d2cd94a2c54cbfc7a76827" gracePeriod=15 Apr 23 16:40:13.738346 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:13.738322 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6dfd99f556-drgmr_5a031aac-7d65-4acb-845d-f7a232e069e7/console/0.log" Apr 23 16:40:13.738453 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:13.738388 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6dfd99f556-drgmr" Apr 23 16:40:13.849016 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:13.848920 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a031aac-7d65-4acb-845d-f7a232e069e7-trusted-ca-bundle\") pod \"5a031aac-7d65-4acb-845d-f7a232e069e7\" (UID: \"5a031aac-7d65-4acb-845d-f7a232e069e7\") " Apr 23 16:40:13.849016 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:13.848967 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5a031aac-7d65-4acb-845d-f7a232e069e7-console-config\") pod \"5a031aac-7d65-4acb-845d-f7a232e069e7\" (UID: \"5a031aac-7d65-4acb-845d-f7a232e069e7\") " Apr 23 16:40:13.849016 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:13.849007 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxw4z\" (UniqueName: \"kubernetes.io/projected/5a031aac-7d65-4acb-845d-f7a232e069e7-kube-api-access-zxw4z\") pod \"5a031aac-7d65-4acb-845d-f7a232e069e7\" (UID: \"5a031aac-7d65-4acb-845d-f7a232e069e7\") " Apr 23 16:40:13.849316 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:13.849030 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a031aac-7d65-4acb-845d-f7a232e069e7-service-ca\") pod \"5a031aac-7d65-4acb-845d-f7a232e069e7\" (UID: \"5a031aac-7d65-4acb-845d-f7a232e069e7\") " Apr 23 16:40:13.849316 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:13.849054 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a031aac-7d65-4acb-845d-f7a232e069e7-console-serving-cert\") pod \"5a031aac-7d65-4acb-845d-f7a232e069e7\" (UID: \"5a031aac-7d65-4acb-845d-f7a232e069e7\") " Apr 23 16:40:13.849316 ip-10-0-128-198 
kubenswrapper[2580]: I0423 16:40:13.849095 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5a031aac-7d65-4acb-845d-f7a232e069e7-console-oauth-config\") pod \"5a031aac-7d65-4acb-845d-f7a232e069e7\" (UID: \"5a031aac-7d65-4acb-845d-f7a232e069e7\") " Apr 23 16:40:13.849316 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:13.849121 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5a031aac-7d65-4acb-845d-f7a232e069e7-oauth-serving-cert\") pod \"5a031aac-7d65-4acb-845d-f7a232e069e7\" (UID: \"5a031aac-7d65-4acb-845d-f7a232e069e7\") " Apr 23 16:40:13.849526 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:13.849413 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a031aac-7d65-4acb-845d-f7a232e069e7-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5a031aac-7d65-4acb-845d-f7a232e069e7" (UID: "5a031aac-7d65-4acb-845d-f7a232e069e7"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:40:13.849526 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:13.849483 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a031aac-7d65-4acb-845d-f7a232e069e7-service-ca" (OuterVolumeSpecName: "service-ca") pod "5a031aac-7d65-4acb-845d-f7a232e069e7" (UID: "5a031aac-7d65-4acb-845d-f7a232e069e7"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:40:13.849631 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:13.849532 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a031aac-7d65-4acb-845d-f7a232e069e7-console-config" (OuterVolumeSpecName: "console-config") pod "5a031aac-7d65-4acb-845d-f7a232e069e7" (UID: "5a031aac-7d65-4acb-845d-f7a232e069e7"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:40:13.849729 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:13.849707 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a031aac-7d65-4acb-845d-f7a232e069e7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5a031aac-7d65-4acb-845d-f7a232e069e7" (UID: "5a031aac-7d65-4acb-845d-f7a232e069e7"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:40:13.851277 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:13.851246 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a031aac-7d65-4acb-845d-f7a232e069e7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5a031aac-7d65-4acb-845d-f7a232e069e7" (UID: "5a031aac-7d65-4acb-845d-f7a232e069e7"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:40:13.851277 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:13.851263 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a031aac-7d65-4acb-845d-f7a232e069e7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5a031aac-7d65-4acb-845d-f7a232e069e7" (UID: "5a031aac-7d65-4acb-845d-f7a232e069e7"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:40:13.851436 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:13.851254 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a031aac-7d65-4acb-845d-f7a232e069e7-kube-api-access-zxw4z" (OuterVolumeSpecName: "kube-api-access-zxw4z") pod "5a031aac-7d65-4acb-845d-f7a232e069e7" (UID: "5a031aac-7d65-4acb-845d-f7a232e069e7"). InnerVolumeSpecName "kube-api-access-zxw4z". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:40:13.950541 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:13.950505 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a031aac-7d65-4acb-845d-f7a232e069e7-trusted-ca-bundle\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:40:13.950541 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:13.950535 2580 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5a031aac-7d65-4acb-845d-f7a232e069e7-console-config\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:40:13.950541 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:13.950545 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zxw4z\" (UniqueName: \"kubernetes.io/projected/5a031aac-7d65-4acb-845d-f7a232e069e7-kube-api-access-zxw4z\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:40:13.950768 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:13.950555 2580 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a031aac-7d65-4acb-845d-f7a232e069e7-service-ca\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:40:13.950768 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:13.950565 2580 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5a031aac-7d65-4acb-845d-f7a232e069e7-console-serving-cert\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:40:13.950768 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:13.950572 2580 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5a031aac-7d65-4acb-845d-f7a232e069e7-console-oauth-config\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:40:13.950768 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:13.950581 2580 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5a031aac-7d65-4acb-845d-f7a232e069e7-oauth-serving-cert\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:40:14.413116 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:14.413086 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6dfd99f556-drgmr_5a031aac-7d65-4acb-845d-f7a232e069e7/console/0.log" Apr 23 16:40:14.413780 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:14.413758 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6dfd99f556-drgmr_5a031aac-7d65-4acb-845d-f7a232e069e7/console/0.log" Apr 23 16:40:14.425443 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:14.425418 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfkqz_5949893b-cd3d-46d5-b194-4ef1ad542b81/ovn-acl-logging/0.log" Apr 23 16:40:14.426424 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:14.426403 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfkqz_5949893b-cd3d-46d5-b194-4ef1ad542b81/ovn-acl-logging/0.log" Apr 23 16:40:14.429049 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:14.429030 2580 kubelet.go:1628] "Image garbage collection succeeded" Apr 23 16:40:14.500573 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:14.500545 2580 
generic.go:358] "Generic (PLEG): container finished" podID="5a031aac-7d65-4acb-845d-f7a232e069e7" containerID="ec1bdf70f0d39ae6a5f5b593066c81316ac6541120d2cd94a2c54cbfc7a76827" exitCode=2 Apr 23 16:40:14.501975 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:14.500604 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6dfd99f556-drgmr" Apr 23 16:40:14.501975 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:14.500607 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dfd99f556-drgmr" event={"ID":"5a031aac-7d65-4acb-845d-f7a232e069e7","Type":"ContainerDied","Data":"ec1bdf70f0d39ae6a5f5b593066c81316ac6541120d2cd94a2c54cbfc7a76827"} Apr 23 16:40:14.501975 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:14.500633 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dfd99f556-drgmr" event={"ID":"5a031aac-7d65-4acb-845d-f7a232e069e7","Type":"ContainerDied","Data":"f4fdafd5a9c1cffbf58c720b1c3e1588927b33047d8d493b3ce8899ea5a7f1f9"} Apr 23 16:40:14.501975 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:14.500647 2580 scope.go:117] "RemoveContainer" containerID="ec1bdf70f0d39ae6a5f5b593066c81316ac6541120d2cd94a2c54cbfc7a76827" Apr 23 16:40:14.508594 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:14.508579 2580 scope.go:117] "RemoveContainer" containerID="ec1bdf70f0d39ae6a5f5b593066c81316ac6541120d2cd94a2c54cbfc7a76827" Apr 23 16:40:14.508836 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:40:14.508818 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec1bdf70f0d39ae6a5f5b593066c81316ac6541120d2cd94a2c54cbfc7a76827\": container with ID starting with ec1bdf70f0d39ae6a5f5b593066c81316ac6541120d2cd94a2c54cbfc7a76827 not found: ID does not exist" containerID="ec1bdf70f0d39ae6a5f5b593066c81316ac6541120d2cd94a2c54cbfc7a76827" Apr 23 16:40:14.508875 ip-10-0-128-198 
kubenswrapper[2580]: I0423 16:40:14.508843 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec1bdf70f0d39ae6a5f5b593066c81316ac6541120d2cd94a2c54cbfc7a76827"} err="failed to get container status \"ec1bdf70f0d39ae6a5f5b593066c81316ac6541120d2cd94a2c54cbfc7a76827\": rpc error: code = NotFound desc = could not find container \"ec1bdf70f0d39ae6a5f5b593066c81316ac6541120d2cd94a2c54cbfc7a76827\": container with ID starting with ec1bdf70f0d39ae6a5f5b593066c81316ac6541120d2cd94a2c54cbfc7a76827 not found: ID does not exist" Apr 23 16:40:14.537274 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:14.537245 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6dfd99f556-drgmr"] Apr 23 16:40:14.541127 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:14.541103 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6dfd99f556-drgmr"] Apr 23 16:40:16.536472 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:16.536430 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a031aac-7d65-4acb-845d-f7a232e069e7" path="/var/lib/kubelet/pods/5a031aac-7d65-4acb-845d-f7a232e069e7/volumes" Apr 23 16:40:51.126480 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:51.126422 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-66bd4b4847-f82xw"] Apr 23 16:40:51.126896 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:51.126753 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a031aac-7d65-4acb-845d-f7a232e069e7" containerName="console" Apr 23 16:40:51.126896 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:51.126765 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a031aac-7d65-4acb-845d-f7a232e069e7" containerName="console" Apr 23 16:40:51.126896 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:51.126817 2580 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="5a031aac-7d65-4acb-845d-f7a232e069e7" containerName="console" Apr 23 16:40:51.128868 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:51.128850 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66bd4b4847-f82xw" Apr 23 16:40:51.130660 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:51.130636 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a4f8871-af5b-4b90-bf35-070cc28d88e0-console-serving-cert\") pod \"console-66bd4b4847-f82xw\" (UID: \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\") " pod="openshift-console/console-66bd4b4847-f82xw" Apr 23 16:40:51.130771 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:51.130685 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8r9r\" (UniqueName: \"kubernetes.io/projected/0a4f8871-af5b-4b90-bf35-070cc28d88e0-kube-api-access-h8r9r\") pod \"console-66bd4b4847-f82xw\" (UID: \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\") " pod="openshift-console/console-66bd4b4847-f82xw" Apr 23 16:40:51.130771 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:51.130756 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a4f8871-af5b-4b90-bf35-070cc28d88e0-console-oauth-config\") pod \"console-66bd4b4847-f82xw\" (UID: \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\") " pod="openshift-console/console-66bd4b4847-f82xw" Apr 23 16:40:51.130883 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:51.130860 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a4f8871-af5b-4b90-bf35-070cc28d88e0-service-ca\") pod \"console-66bd4b4847-f82xw\" (UID: \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\") " 
pod="openshift-console/console-66bd4b4847-f82xw" Apr 23 16:40:51.130924 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:51.130903 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a4f8871-af5b-4b90-bf35-070cc28d88e0-trusted-ca-bundle\") pod \"console-66bd4b4847-f82xw\" (UID: \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\") " pod="openshift-console/console-66bd4b4847-f82xw" Apr 23 16:40:51.130973 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:51.130934 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a4f8871-af5b-4b90-bf35-070cc28d88e0-oauth-serving-cert\") pod \"console-66bd4b4847-f82xw\" (UID: \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\") " pod="openshift-console/console-66bd4b4847-f82xw" Apr 23 16:40:51.131024 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:51.130987 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a4f8871-af5b-4b90-bf35-070cc28d88e0-console-config\") pod \"console-66bd4b4847-f82xw\" (UID: \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\") " pod="openshift-console/console-66bd4b4847-f82xw" Apr 23 16:40:51.195527 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:51.195493 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66bd4b4847-f82xw"] Apr 23 16:40:51.231357 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:51.231321 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a4f8871-af5b-4b90-bf35-070cc28d88e0-trusted-ca-bundle\") pod \"console-66bd4b4847-f82xw\" (UID: \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\") " pod="openshift-console/console-66bd4b4847-f82xw" Apr 23 16:40:51.231357 ip-10-0-128-198 
kubenswrapper[2580]: I0423 16:40:51.231359 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a4f8871-af5b-4b90-bf35-070cc28d88e0-oauth-serving-cert\") pod \"console-66bd4b4847-f82xw\" (UID: \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\") " pod="openshift-console/console-66bd4b4847-f82xw" Apr 23 16:40:51.231587 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:51.231408 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a4f8871-af5b-4b90-bf35-070cc28d88e0-console-config\") pod \"console-66bd4b4847-f82xw\" (UID: \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\") " pod="openshift-console/console-66bd4b4847-f82xw" Apr 23 16:40:51.231587 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:51.231456 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a4f8871-af5b-4b90-bf35-070cc28d88e0-console-serving-cert\") pod \"console-66bd4b4847-f82xw\" (UID: \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\") " pod="openshift-console/console-66bd4b4847-f82xw" Apr 23 16:40:51.231587 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:51.231502 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h8r9r\" (UniqueName: \"kubernetes.io/projected/0a4f8871-af5b-4b90-bf35-070cc28d88e0-kube-api-access-h8r9r\") pod \"console-66bd4b4847-f82xw\" (UID: \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\") " pod="openshift-console/console-66bd4b4847-f82xw" Apr 23 16:40:51.231587 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:51.231534 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a4f8871-af5b-4b90-bf35-070cc28d88e0-console-oauth-config\") pod \"console-66bd4b4847-f82xw\" (UID: \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\") " 
pod="openshift-console/console-66bd4b4847-f82xw" Apr 23 16:40:51.231587 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:51.231558 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a4f8871-af5b-4b90-bf35-070cc28d88e0-service-ca\") pod \"console-66bd4b4847-f82xw\" (UID: \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\") " pod="openshift-console/console-66bd4b4847-f82xw" Apr 23 16:40:51.232168 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:51.232137 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a4f8871-af5b-4b90-bf35-070cc28d88e0-console-config\") pod \"console-66bd4b4847-f82xw\" (UID: \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\") " pod="openshift-console/console-66bd4b4847-f82xw" Apr 23 16:40:51.232315 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:51.232270 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a4f8871-af5b-4b90-bf35-070cc28d88e0-trusted-ca-bundle\") pod \"console-66bd4b4847-f82xw\" (UID: \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\") " pod="openshift-console/console-66bd4b4847-f82xw" Apr 23 16:40:51.232384 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:51.232270 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a4f8871-af5b-4b90-bf35-070cc28d88e0-oauth-serving-cert\") pod \"console-66bd4b4847-f82xw\" (UID: \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\") " pod="openshift-console/console-66bd4b4847-f82xw" Apr 23 16:40:51.232384 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:51.232276 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a4f8871-af5b-4b90-bf35-070cc28d88e0-service-ca\") pod \"console-66bd4b4847-f82xw\" (UID: 
\"0a4f8871-af5b-4b90-bf35-070cc28d88e0\") " pod="openshift-console/console-66bd4b4847-f82xw" Apr 23 16:40:51.234023 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:51.233995 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a4f8871-af5b-4b90-bf35-070cc28d88e0-console-oauth-config\") pod \"console-66bd4b4847-f82xw\" (UID: \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\") " pod="openshift-console/console-66bd4b4847-f82xw" Apr 23 16:40:51.234111 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:51.234082 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a4f8871-af5b-4b90-bf35-070cc28d88e0-console-serving-cert\") pod \"console-66bd4b4847-f82xw\" (UID: \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\") " pod="openshift-console/console-66bd4b4847-f82xw" Apr 23 16:40:51.240814 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:51.240793 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8r9r\" (UniqueName: \"kubernetes.io/projected/0a4f8871-af5b-4b90-bf35-070cc28d88e0-kube-api-access-h8r9r\") pod \"console-66bd4b4847-f82xw\" (UID: \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\") " pod="openshift-console/console-66bd4b4847-f82xw" Apr 23 16:40:51.437912 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:51.437819 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66bd4b4847-f82xw" Apr 23 16:40:51.566361 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:51.566327 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66bd4b4847-f82xw"] Apr 23 16:40:51.569374 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:40:51.569342 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a4f8871_af5b_4b90_bf35_070cc28d88e0.slice/crio-cf47cb25b781aad2c97f5c4002d86a5d0dee577175571075752cc15633adda02 WatchSource:0}: Error finding container cf47cb25b781aad2c97f5c4002d86a5d0dee577175571075752cc15633adda02: Status 404 returned error can't find the container with id cf47cb25b781aad2c97f5c4002d86a5d0dee577175571075752cc15633adda02 Apr 23 16:40:51.571080 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:51.571065 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 16:40:51.604763 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:51.604727 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66bd4b4847-f82xw" event={"ID":"0a4f8871-af5b-4b90-bf35-070cc28d88e0","Type":"ContainerStarted","Data":"cf47cb25b781aad2c97f5c4002d86a5d0dee577175571075752cc15633adda02"} Apr 23 16:40:52.608704 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:52.608666 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66bd4b4847-f82xw" event={"ID":"0a4f8871-af5b-4b90-bf35-070cc28d88e0","Type":"ContainerStarted","Data":"b5d89802c8934db76a9a0a21cd87bb334348e21559c21c0b7c0f5e22920f4623"} Apr 23 16:40:52.629172 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:40:52.629118 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-66bd4b4847-f82xw" podStartSLOduration=1.6291040159999999 podStartE2EDuration="1.629104016s" podCreationTimestamp="2026-04-23 16:40:51 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:40:52.627968788 +0000 UTC m=+338.689043042" watchObservedRunningTime="2026-04-23 16:40:52.629104016 +0000 UTC m=+338.690178270" Apr 23 16:41:01.438954 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:01.438858 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-66bd4b4847-f82xw" Apr 23 16:41:01.439412 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:01.438961 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-66bd4b4847-f82xw" Apr 23 16:41:01.443693 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:01.443672 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-66bd4b4847-f82xw" Apr 23 16:41:01.638080 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:01.638049 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-66bd4b4847-f82xw" Apr 23 16:41:01.689551 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:01.689473 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68fdff544c-rq5nw"] Apr 23 16:41:26.708427 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:26.708364 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-68fdff544c-rq5nw" podUID="671f1ce5-4c41-43fe-92fa-f0a9b4491462" containerName="console" containerID="cri-o://b7e9f5e855a7663c126b9fdf7fa4f245105a8d30ac1a6691d9d5bef63f998208" gracePeriod=15 Apr 23 16:41:26.945634 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:26.945607 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68fdff544c-rq5nw_671f1ce5-4c41-43fe-92fa-f0a9b4491462/console/0.log" Apr 23 16:41:26.945758 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:26.945671 2580 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68fdff544c-rq5nw" Apr 23 16:41:26.992150 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:26.992122 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671f1ce5-4c41-43fe-92fa-f0a9b4491462-trusted-ca-bundle\") pod \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\" (UID: \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\") " Apr 23 16:41:26.992310 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:26.992183 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/671f1ce5-4c41-43fe-92fa-f0a9b4491462-oauth-serving-cert\") pod \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\" (UID: \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\") " Apr 23 16:41:26.992369 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:26.992308 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/671f1ce5-4c41-43fe-92fa-f0a9b4491462-console-serving-cert\") pod \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\" (UID: \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\") " Apr 23 16:41:26.992369 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:26.992341 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/671f1ce5-4c41-43fe-92fa-f0a9b4491462-service-ca\") pod \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\" (UID: \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\") " Apr 23 16:41:26.992369 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:26.992367 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-977tq\" (UniqueName: \"kubernetes.io/projected/671f1ce5-4c41-43fe-92fa-f0a9b4491462-kube-api-access-977tq\") pod \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\" (UID: 
\"671f1ce5-4c41-43fe-92fa-f0a9b4491462\") " Apr 23 16:41:26.992518 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:26.992389 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/671f1ce5-4c41-43fe-92fa-f0a9b4491462-console-oauth-config\") pod \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\" (UID: \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\") " Apr 23 16:41:26.992518 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:26.992415 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/671f1ce5-4c41-43fe-92fa-f0a9b4491462-console-config\") pod \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\" (UID: \"671f1ce5-4c41-43fe-92fa-f0a9b4491462\") " Apr 23 16:41:26.992620 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:26.992596 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/671f1ce5-4c41-43fe-92fa-f0a9b4491462-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "671f1ce5-4c41-43fe-92fa-f0a9b4491462" (UID: "671f1ce5-4c41-43fe-92fa-f0a9b4491462"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:41:26.992675 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:26.992614 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/671f1ce5-4c41-43fe-92fa-f0a9b4491462-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "671f1ce5-4c41-43fe-92fa-f0a9b4491462" (UID: "671f1ce5-4c41-43fe-92fa-f0a9b4491462"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:41:26.992675 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:26.992631 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/671f1ce5-4c41-43fe-92fa-f0a9b4491462-service-ca" (OuterVolumeSpecName: "service-ca") pod "671f1ce5-4c41-43fe-92fa-f0a9b4491462" (UID: "671f1ce5-4c41-43fe-92fa-f0a9b4491462"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:41:26.992786 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:26.992691 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671f1ce5-4c41-43fe-92fa-f0a9b4491462-trusted-ca-bundle\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:41:26.992786 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:26.992711 2580 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/671f1ce5-4c41-43fe-92fa-f0a9b4491462-oauth-serving-cert\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:41:26.992786 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:26.992726 2580 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/671f1ce5-4c41-43fe-92fa-f0a9b4491462-service-ca\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:41:26.992923 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:26.992885 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/671f1ce5-4c41-43fe-92fa-f0a9b4491462-console-config" (OuterVolumeSpecName: "console-config") pod "671f1ce5-4c41-43fe-92fa-f0a9b4491462" (UID: "671f1ce5-4c41-43fe-92fa-f0a9b4491462"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:41:26.994591 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:26.994562 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671f1ce5-4c41-43fe-92fa-f0a9b4491462-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "671f1ce5-4c41-43fe-92fa-f0a9b4491462" (UID: "671f1ce5-4c41-43fe-92fa-f0a9b4491462"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:41:26.994710 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:26.994576 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/671f1ce5-4c41-43fe-92fa-f0a9b4491462-kube-api-access-977tq" (OuterVolumeSpecName: "kube-api-access-977tq") pod "671f1ce5-4c41-43fe-92fa-f0a9b4491462" (UID: "671f1ce5-4c41-43fe-92fa-f0a9b4491462"). InnerVolumeSpecName "kube-api-access-977tq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:41:26.994710 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:26.994590 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671f1ce5-4c41-43fe-92fa-f0a9b4491462-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "671f1ce5-4c41-43fe-92fa-f0a9b4491462" (UID: "671f1ce5-4c41-43fe-92fa-f0a9b4491462"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:41:27.093498 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:27.093463 2580 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/671f1ce5-4c41-43fe-92fa-f0a9b4491462-console-serving-cert\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:41:27.093498 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:27.093493 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-977tq\" (UniqueName: \"kubernetes.io/projected/671f1ce5-4c41-43fe-92fa-f0a9b4491462-kube-api-access-977tq\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:41:27.093498 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:27.093503 2580 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/671f1ce5-4c41-43fe-92fa-f0a9b4491462-console-oauth-config\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:41:27.093724 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:27.093514 2580 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/671f1ce5-4c41-43fe-92fa-f0a9b4491462-console-config\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:41:27.705146 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:27.705102 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68fdff544c-rq5nw_671f1ce5-4c41-43fe-92fa-f0a9b4491462/console/0.log" Apr 23 16:41:27.705318 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:27.705158 2580 generic.go:358] "Generic (PLEG): container finished" podID="671f1ce5-4c41-43fe-92fa-f0a9b4491462" containerID="b7e9f5e855a7663c126b9fdf7fa4f245105a8d30ac1a6691d9d5bef63f998208" exitCode=2 Apr 23 16:41:27.705318 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:27.705223 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68fdff544c-rq5nw" Apr 23 16:41:27.705318 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:27.705233 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68fdff544c-rq5nw" event={"ID":"671f1ce5-4c41-43fe-92fa-f0a9b4491462","Type":"ContainerDied","Data":"b7e9f5e855a7663c126b9fdf7fa4f245105a8d30ac1a6691d9d5bef63f998208"} Apr 23 16:41:27.705318 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:27.705259 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68fdff544c-rq5nw" event={"ID":"671f1ce5-4c41-43fe-92fa-f0a9b4491462","Type":"ContainerDied","Data":"e8e92368f6c34f0b804eaae8a38eb313f906ee92678915de86080e37eaff1a18"} Apr 23 16:41:27.705318 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:27.705274 2580 scope.go:117] "RemoveContainer" containerID="b7e9f5e855a7663c126b9fdf7fa4f245105a8d30ac1a6691d9d5bef63f998208" Apr 23 16:41:27.713656 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:27.713453 2580 scope.go:117] "RemoveContainer" containerID="b7e9f5e855a7663c126b9fdf7fa4f245105a8d30ac1a6691d9d5bef63f998208" Apr 23 16:41:27.713871 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:41:27.713705 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7e9f5e855a7663c126b9fdf7fa4f245105a8d30ac1a6691d9d5bef63f998208\": container with ID starting with b7e9f5e855a7663c126b9fdf7fa4f245105a8d30ac1a6691d9d5bef63f998208 not found: ID does not exist" containerID="b7e9f5e855a7663c126b9fdf7fa4f245105a8d30ac1a6691d9d5bef63f998208" Apr 23 16:41:27.713871 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:27.713732 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7e9f5e855a7663c126b9fdf7fa4f245105a8d30ac1a6691d9d5bef63f998208"} err="failed to get container status \"b7e9f5e855a7663c126b9fdf7fa4f245105a8d30ac1a6691d9d5bef63f998208\": rpc error: code = 
NotFound desc = could not find container \"b7e9f5e855a7663c126b9fdf7fa4f245105a8d30ac1a6691d9d5bef63f998208\": container with ID starting with b7e9f5e855a7663c126b9fdf7fa4f245105a8d30ac1a6691d9d5bef63f998208 not found: ID does not exist" Apr 23 16:41:27.726889 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:27.726864 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68fdff544c-rq5nw"] Apr 23 16:41:27.732587 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:27.732558 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-68fdff544c-rq5nw"] Apr 23 16:41:28.536835 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:28.536800 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="671f1ce5-4c41-43fe-92fa-f0a9b4491462" path="/var/lib/kubelet/pods/671f1ce5-4c41-43fe-92fa-f0a9b4491462/volumes" Apr 23 16:41:47.979458 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:47.979425 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnjqjv"] Apr 23 16:41:47.979890 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:47.979713 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="671f1ce5-4c41-43fe-92fa-f0a9b4491462" containerName="console" Apr 23 16:41:47.979890 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:47.979725 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="671f1ce5-4c41-43fe-92fa-f0a9b4491462" containerName="console" Apr 23 16:41:47.979890 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:47.979780 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="671f1ce5-4c41-43fe-92fa-f0a9b4491462" containerName="console" Apr 23 16:41:47.982905 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:47.982888 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnjqjv" Apr 23 16:41:47.985858 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:47.985836 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 23 16:41:47.986902 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:47.986884 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5s2x5\"" Apr 23 16:41:47.986950 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:47.986887 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 23 16:41:47.989599 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:47.989575 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnjqjv"] Apr 23 16:41:48.057172 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:48.057134 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkmmn\" (UniqueName: \"kubernetes.io/projected/d518e4b6-a022-449e-ba41-4e6e851c03b6-kube-api-access-lkmmn\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnjqjv\" (UID: \"d518e4b6-a022-449e-ba41-4e6e851c03b6\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnjqjv" Apr 23 16:41:48.057374 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:48.057179 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d518e4b6-a022-449e-ba41-4e6e851c03b6-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnjqjv\" (UID: \"d518e4b6-a022-449e-ba41-4e6e851c03b6\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnjqjv" Apr 23 16:41:48.057374 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:48.057267 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d518e4b6-a022-449e-ba41-4e6e851c03b6-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnjqjv\" (UID: \"d518e4b6-a022-449e-ba41-4e6e851c03b6\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnjqjv" Apr 23 16:41:48.158593 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:48.158560 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d518e4b6-a022-449e-ba41-4e6e851c03b6-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnjqjv\" (UID: \"d518e4b6-a022-449e-ba41-4e6e851c03b6\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnjqjv" Apr 23 16:41:48.159125 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:48.159089 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lkmmn\" (UniqueName: \"kubernetes.io/projected/d518e4b6-a022-449e-ba41-4e6e851c03b6-kube-api-access-lkmmn\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnjqjv\" (UID: \"d518e4b6-a022-449e-ba41-4e6e851c03b6\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnjqjv" Apr 23 16:41:48.159236 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:48.159167 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d518e4b6-a022-449e-ba41-4e6e851c03b6-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnjqjv\" (UID: \"d518e4b6-a022-449e-ba41-4e6e851c03b6\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnjqjv" Apr 23 16:41:48.159428 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:48.159401 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d518e4b6-a022-449e-ba41-4e6e851c03b6-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnjqjv\" (UID: \"d518e4b6-a022-449e-ba41-4e6e851c03b6\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnjqjv" Apr 23 16:41:48.159554 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:48.159532 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d518e4b6-a022-449e-ba41-4e6e851c03b6-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnjqjv\" (UID: \"d518e4b6-a022-449e-ba41-4e6e851c03b6\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnjqjv" Apr 23 16:41:48.168076 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:48.168051 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkmmn\" (UniqueName: \"kubernetes.io/projected/d518e4b6-a022-449e-ba41-4e6e851c03b6-kube-api-access-lkmmn\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnjqjv\" (UID: \"d518e4b6-a022-449e-ba41-4e6e851c03b6\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnjqjv" Apr 23 16:41:48.293324 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:48.293209 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnjqjv" Apr 23 16:41:48.417008 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:48.416983 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnjqjv"] Apr 23 16:41:48.419505 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:41:48.419474 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd518e4b6_a022_449e_ba41_4e6e851c03b6.slice/crio-e59f895e5eba88751212d1c8d68bc172d125b9ca04c159b663b202ea1e2bcf3b WatchSource:0}: Error finding container e59f895e5eba88751212d1c8d68bc172d125b9ca04c159b663b202ea1e2bcf3b: Status 404 returned error can't find the container with id e59f895e5eba88751212d1c8d68bc172d125b9ca04c159b663b202ea1e2bcf3b Apr 23 16:41:48.764376 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:48.764336 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnjqjv" event={"ID":"d518e4b6-a022-449e-ba41-4e6e851c03b6","Type":"ContainerStarted","Data":"e59f895e5eba88751212d1c8d68bc172d125b9ca04c159b663b202ea1e2bcf3b"} Apr 23 16:41:55.784628 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:55.784600 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnjqjv" event={"ID":"d518e4b6-a022-449e-ba41-4e6e851c03b6","Type":"ContainerStarted","Data":"f847c640624d0721c1bd5299e62a8504bc7862163b10a94fe01d6c3e50cf4614"} Apr 23 16:41:56.788924 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:56.788890 2580 generic.go:358] "Generic (PLEG): container finished" podID="d518e4b6-a022-449e-ba41-4e6e851c03b6" containerID="f847c640624d0721c1bd5299e62a8504bc7862163b10a94fe01d6c3e50cf4614" exitCode=0 Apr 23 16:41:56.789338 ip-10-0-128-198 
kubenswrapper[2580]: I0423 16:41:56.788928 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnjqjv" event={"ID":"d518e4b6-a022-449e-ba41-4e6e851c03b6","Type":"ContainerDied","Data":"f847c640624d0721c1bd5299e62a8504bc7862163b10a94fe01d6c3e50cf4614"} Apr 23 16:41:59.802041 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:59.802003 2580 generic.go:358] "Generic (PLEG): container finished" podID="d518e4b6-a022-449e-ba41-4e6e851c03b6" containerID="04082d4b74d12cbdd0910db32ead058e33347784eb736cf20b5a3f009ff70f98" exitCode=0 Apr 23 16:41:59.802428 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:41:59.802091 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnjqjv" event={"ID":"d518e4b6-a022-449e-ba41-4e6e851c03b6","Type":"ContainerDied","Data":"04082d4b74d12cbdd0910db32ead058e33347784eb736cf20b5a3f009ff70f98"} Apr 23 16:42:07.825973 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:07.825940 2580 generic.go:358] "Generic (PLEG): container finished" podID="d518e4b6-a022-449e-ba41-4e6e851c03b6" containerID="8f5325244465f7941151b902fdf1de280a1c8f19b57cd4b383b2a6b6a953e027" exitCode=0 Apr 23 16:42:07.826400 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:07.826026 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnjqjv" event={"ID":"d518e4b6-a022-449e-ba41-4e6e851c03b6","Type":"ContainerDied","Data":"8f5325244465f7941151b902fdf1de280a1c8f19b57cd4b383b2a6b6a953e027"} Apr 23 16:42:08.944964 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:08.944943 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnjqjv"
Apr 23 16:42:09.037965 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:09.037925 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d518e4b6-a022-449e-ba41-4e6e851c03b6-bundle\") pod \"d518e4b6-a022-449e-ba41-4e6e851c03b6\" (UID: \"d518e4b6-a022-449e-ba41-4e6e851c03b6\") "
Apr 23 16:42:09.037965 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:09.037975 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkmmn\" (UniqueName: \"kubernetes.io/projected/d518e4b6-a022-449e-ba41-4e6e851c03b6-kube-api-access-lkmmn\") pod \"d518e4b6-a022-449e-ba41-4e6e851c03b6\" (UID: \"d518e4b6-a022-449e-ba41-4e6e851c03b6\") "
Apr 23 16:42:09.038169 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:09.037990 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d518e4b6-a022-449e-ba41-4e6e851c03b6-util\") pod \"d518e4b6-a022-449e-ba41-4e6e851c03b6\" (UID: \"d518e4b6-a022-449e-ba41-4e6e851c03b6\") "
Apr 23 16:42:09.038659 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:09.038625 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d518e4b6-a022-449e-ba41-4e6e851c03b6-bundle" (OuterVolumeSpecName: "bundle") pod "d518e4b6-a022-449e-ba41-4e6e851c03b6" (UID: "d518e4b6-a022-449e-ba41-4e6e851c03b6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:42:09.040229 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:09.040200 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d518e4b6-a022-449e-ba41-4e6e851c03b6-kube-api-access-lkmmn" (OuterVolumeSpecName: "kube-api-access-lkmmn") pod "d518e4b6-a022-449e-ba41-4e6e851c03b6" (UID: "d518e4b6-a022-449e-ba41-4e6e851c03b6"). InnerVolumeSpecName "kube-api-access-lkmmn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:42:09.042645 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:09.042623 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d518e4b6-a022-449e-ba41-4e6e851c03b6-util" (OuterVolumeSpecName: "util") pod "d518e4b6-a022-449e-ba41-4e6e851c03b6" (UID: "d518e4b6-a022-449e-ba41-4e6e851c03b6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:42:09.139537 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:09.139456 2580 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d518e4b6-a022-449e-ba41-4e6e851c03b6-bundle\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:42:09.139537 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:09.139487 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lkmmn\" (UniqueName: \"kubernetes.io/projected/d518e4b6-a022-449e-ba41-4e6e851c03b6-kube-api-access-lkmmn\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:42:09.139537 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:09.139497 2580 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d518e4b6-a022-449e-ba41-4e6e851c03b6-util\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:42:09.832720 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:09.832677 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnjqjv" event={"ID":"d518e4b6-a022-449e-ba41-4e6e851c03b6","Type":"ContainerDied","Data":"e59f895e5eba88751212d1c8d68bc172d125b9ca04c159b663b202ea1e2bcf3b"}
Apr 23 16:42:09.832720 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:09.832721 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e59f895e5eba88751212d1c8d68bc172d125b9ca04c159b663b202ea1e2bcf3b"
Apr 23 16:42:09.832720 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:09.832725 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnjqjv"
Apr 23 16:42:15.541735 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:15.541699 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-q6nrg"]
Apr 23 16:42:15.542103 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:15.542031 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d518e4b6-a022-449e-ba41-4e6e851c03b6" containerName="util"
Apr 23 16:42:15.542103 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:15.542046 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="d518e4b6-a022-449e-ba41-4e6e851c03b6" containerName="util"
Apr 23 16:42:15.542103 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:15.542066 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d518e4b6-a022-449e-ba41-4e6e851c03b6" containerName="pull"
Apr 23 16:42:15.542103 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:15.542072 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="d518e4b6-a022-449e-ba41-4e6e851c03b6" containerName="pull"
Apr 23 16:42:15.542103 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:15.542078 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d518e4b6-a022-449e-ba41-4e6e851c03b6" containerName="extract"
Apr 23 16:42:15.542103 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:15.542085 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="d518e4b6-a022-449e-ba41-4e6e851c03b6" containerName="extract"
Apr 23 16:42:15.542303 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:15.542137 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="d518e4b6-a022-449e-ba41-4e6e851c03b6" containerName="extract"
Apr 23 16:42:15.549448 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:15.549428 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-q6nrg"
Apr 23 16:42:15.554713 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:15.554687 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 23 16:42:15.554836 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:15.554698 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-44pz7\""
Apr 23 16:42:15.555047 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:15.555026 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 23 16:42:15.555158 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:15.555107 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 23 16:42:15.570017 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:15.569981 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-q6nrg"]
Apr 23 16:42:15.587449 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:15.587416 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/60579d7f-243c-4ce3-b297-054d0defd1b2-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-q6nrg\" (UID: \"60579d7f-243c-4ce3-b297-054d0defd1b2\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-q6nrg"
Apr 23 16:42:15.587755 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:15.587732 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgr26\" (UniqueName: \"kubernetes.io/projected/60579d7f-243c-4ce3-b297-054d0defd1b2-kube-api-access-xgr26\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-q6nrg\" (UID: \"60579d7f-243c-4ce3-b297-054d0defd1b2\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-q6nrg"
Apr 23 16:42:15.688488 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:15.688436 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/60579d7f-243c-4ce3-b297-054d0defd1b2-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-q6nrg\" (UID: \"60579d7f-243c-4ce3-b297-054d0defd1b2\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-q6nrg"
Apr 23 16:42:15.688677 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:15.688506 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xgr26\" (UniqueName: \"kubernetes.io/projected/60579d7f-243c-4ce3-b297-054d0defd1b2-kube-api-access-xgr26\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-q6nrg\" (UID: \"60579d7f-243c-4ce3-b297-054d0defd1b2\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-q6nrg"
Apr 23 16:42:15.690794 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:15.690769 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/60579d7f-243c-4ce3-b297-054d0defd1b2-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-q6nrg\" (UID: \"60579d7f-243c-4ce3-b297-054d0defd1b2\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-q6nrg"
Apr 23 16:42:15.697925 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:15.697897 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgr26\" (UniqueName: \"kubernetes.io/projected/60579d7f-243c-4ce3-b297-054d0defd1b2-kube-api-access-xgr26\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-q6nrg\" (UID: \"60579d7f-243c-4ce3-b297-054d0defd1b2\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-q6nrg"
Apr 23 16:42:15.858838 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:15.858757 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-q6nrg"
Apr 23 16:42:15.986366 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:15.986344 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-q6nrg"]
Apr 23 16:42:15.989274 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:42:15.989243 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60579d7f_243c_4ce3_b297_054d0defd1b2.slice/crio-fd9eb7c811b6fcc4bac5a9a7966b8016afd7582277082b7b6a25ce13c2de3431 WatchSource:0}: Error finding container fd9eb7c811b6fcc4bac5a9a7966b8016afd7582277082b7b6a25ce13c2de3431: Status 404 returned error can't find the container with id fd9eb7c811b6fcc4bac5a9a7966b8016afd7582277082b7b6a25ce13c2de3431
Apr 23 16:42:16.856155 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:16.856119 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-q6nrg" event={"ID":"60579d7f-243c-4ce3-b297-054d0defd1b2","Type":"ContainerStarted","Data":"fd9eb7c811b6fcc4bac5a9a7966b8016afd7582277082b7b6a25ce13c2de3431"}
Apr 23 16:42:19.608105 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:19.608069 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-zbw6v"]
Apr 23 16:42:19.611736 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:19.611718 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-zbw6v"
Apr 23 16:42:19.615211 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:19.615188 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 23 16:42:19.615371 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:19.615195 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 23 16:42:19.615531 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:19.615508 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-8h58n\""
Apr 23 16:42:19.621425 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:19.621406 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f-certificates\") pod \"keda-operator-ffbb595cb-zbw6v\" (UID: \"e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f\") " pod="openshift-keda/keda-operator-ffbb595cb-zbw6v"
Apr 23 16:42:19.621529 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:19.621515 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f-cabundle0\") pod \"keda-operator-ffbb595cb-zbw6v\" (UID: \"e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f\") " pod="openshift-keda/keda-operator-ffbb595cb-zbw6v"
Apr 23 16:42:19.621576 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:19.621539 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6kmd\" (UniqueName: \"kubernetes.io/projected/e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f-kube-api-access-g6kmd\") pod \"keda-operator-ffbb595cb-zbw6v\" (UID: \"e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f\") " pod="openshift-keda/keda-operator-ffbb595cb-zbw6v"
Apr 23 16:42:19.627151 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:19.627132 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-zbw6v"]
Apr 23 16:42:19.722542 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:19.722508 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f-certificates\") pod \"keda-operator-ffbb595cb-zbw6v\" (UID: \"e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f\") " pod="openshift-keda/keda-operator-ffbb595cb-zbw6v"
Apr 23 16:42:19.722542 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:19.722545 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f-cabundle0\") pod \"keda-operator-ffbb595cb-zbw6v\" (UID: \"e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f\") " pod="openshift-keda/keda-operator-ffbb595cb-zbw6v"
Apr 23 16:42:19.722791 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:19.722566 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g6kmd\" (UniqueName: \"kubernetes.io/projected/e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f-kube-api-access-g6kmd\") pod \"keda-operator-ffbb595cb-zbw6v\" (UID: \"e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f\") " pod="openshift-keda/keda-operator-ffbb595cb-zbw6v"
Apr 23 16:42:19.722791 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:42:19.722641 2580 secret.go:281] references non-existent secret key: ca.crt
Apr 23 16:42:19.722791 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:42:19.722661 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 23 16:42:19.722791 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:42:19.722670 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-zbw6v: references non-existent secret key: ca.crt
Apr 23 16:42:19.722791 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:42:19.722731 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f-certificates podName:e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f nodeName:}" failed. No retries permitted until 2026-04-23 16:42:20.222715859 +0000 UTC m=+426.283790092 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f-certificates") pod "keda-operator-ffbb595cb-zbw6v" (UID: "e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f") : references non-existent secret key: ca.crt
Apr 23 16:42:19.723238 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:19.723210 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f-cabundle0\") pod \"keda-operator-ffbb595cb-zbw6v\" (UID: \"e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f\") " pod="openshift-keda/keda-operator-ffbb595cb-zbw6v"
Apr 23 16:42:19.742717 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:19.742687 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6kmd\" (UniqueName: \"kubernetes.io/projected/e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f-kube-api-access-g6kmd\") pod \"keda-operator-ffbb595cb-zbw6v\" (UID: \"e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f\") " pod="openshift-keda/keda-operator-ffbb595cb-zbw6v"
Apr 23 16:42:19.869141 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:19.869049 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-q6nrg" event={"ID":"60579d7f-243c-4ce3-b297-054d0defd1b2","Type":"ContainerStarted","Data":"74870070d445c33cb1e9fcd764e44d0f1bbae5d017973ba3f3f3385243c01dc3"}
Apr 23 16:42:19.869141 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:19.869115 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-q6nrg"
Apr 23 16:42:19.889637 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:19.889572 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-q6nrg" podStartSLOduration=1.781168023 podStartE2EDuration="4.889556419s" podCreationTimestamp="2026-04-23 16:42:15 +0000 UTC" firstStartedPulling="2026-04-23 16:42:15.990993408 +0000 UTC m=+422.052067640" lastFinishedPulling="2026-04-23 16:42:19.099381798 +0000 UTC m=+425.160456036" observedRunningTime="2026-04-23 16:42:19.888503901 +0000 UTC m=+425.949578192" watchObservedRunningTime="2026-04-23 16:42:19.889556419 +0000 UTC m=+425.950630673"
Apr 23 16:42:19.945679 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:19.945649 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-759fn"]
Apr 23 16:42:19.949003 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:19.948982 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-759fn"
Apr 23 16:42:19.951483 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:19.951463 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 23 16:42:19.957038 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:19.957016 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-759fn"]
Apr 23 16:42:20.024389 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:20.024357 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5brx\" (UniqueName: \"kubernetes.io/projected/77503e54-fb3e-4f6b-91c8-243d7b500a37-kube-api-access-n5brx\") pod \"keda-metrics-apiserver-7c9f485588-759fn\" (UID: \"77503e54-fb3e-4f6b-91c8-243d7b500a37\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-759fn"
Apr 23 16:42:20.024553 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:20.024400 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/77503e54-fb3e-4f6b-91c8-243d7b500a37-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-759fn\" (UID: \"77503e54-fb3e-4f6b-91c8-243d7b500a37\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-759fn"
Apr 23 16:42:20.024553 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:20.024470 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/77503e54-fb3e-4f6b-91c8-243d7b500a37-certificates\") pod \"keda-metrics-apiserver-7c9f485588-759fn\" (UID: \"77503e54-fb3e-4f6b-91c8-243d7b500a37\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-759fn"
Apr 23 16:42:20.125154 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:20.125071 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n5brx\" (UniqueName: \"kubernetes.io/projected/77503e54-fb3e-4f6b-91c8-243d7b500a37-kube-api-access-n5brx\") pod \"keda-metrics-apiserver-7c9f485588-759fn\" (UID: \"77503e54-fb3e-4f6b-91c8-243d7b500a37\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-759fn"
Apr 23 16:42:20.125154 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:20.125138 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/77503e54-fb3e-4f6b-91c8-243d7b500a37-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-759fn\" (UID: \"77503e54-fb3e-4f6b-91c8-243d7b500a37\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-759fn"
Apr 23 16:42:20.125381 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:20.125212 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/77503e54-fb3e-4f6b-91c8-243d7b500a37-certificates\") pod \"keda-metrics-apiserver-7c9f485588-759fn\" (UID: \"77503e54-fb3e-4f6b-91c8-243d7b500a37\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-759fn"
Apr 23 16:42:20.125381 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:42:20.125358 2580 secret.go:281] references non-existent secret key: tls.crt
Apr 23 16:42:20.125381 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:42:20.125379 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 23 16:42:20.125523 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:42:20.125402 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-759fn: references non-existent secret key: tls.crt
Apr 23 16:42:20.125523 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:42:20.125466 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/77503e54-fb3e-4f6b-91c8-243d7b500a37-certificates podName:77503e54-fb3e-4f6b-91c8-243d7b500a37 nodeName:}" failed. No retries permitted until 2026-04-23 16:42:20.625447121 +0000 UTC m=+426.686521361 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/77503e54-fb3e-4f6b-91c8-243d7b500a37-certificates") pod "keda-metrics-apiserver-7c9f485588-759fn" (UID: "77503e54-fb3e-4f6b-91c8-243d7b500a37") : references non-existent secret key: tls.crt
Apr 23 16:42:20.125637 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:20.125604 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/77503e54-fb3e-4f6b-91c8-243d7b500a37-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-759fn\" (UID: \"77503e54-fb3e-4f6b-91c8-243d7b500a37\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-759fn"
Apr 23 16:42:20.135413 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:20.135381 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5brx\" (UniqueName: \"kubernetes.io/projected/77503e54-fb3e-4f6b-91c8-243d7b500a37-kube-api-access-n5brx\") pod \"keda-metrics-apiserver-7c9f485588-759fn\" (UID: \"77503e54-fb3e-4f6b-91c8-243d7b500a37\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-759fn"
Apr 23 16:42:20.172778 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:20.172743 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-s6mxt"]
Apr 23 16:42:20.176029 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:20.176014 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-s6mxt"
Apr 23 16:42:20.179423 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:20.179400 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 23 16:42:20.194984 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:20.194960 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-s6mxt"]
Apr 23 16:42:20.225714 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:20.225689 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6k9v\" (UniqueName: \"kubernetes.io/projected/d57c31c2-0ec3-4660-b723-18f9e60c59eb-kube-api-access-b6k9v\") pod \"keda-admission-cf49989db-s6mxt\" (UID: \"d57c31c2-0ec3-4660-b723-18f9e60c59eb\") " pod="openshift-keda/keda-admission-cf49989db-s6mxt"
Apr 23 16:42:20.225852 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:20.225767 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f-certificates\") pod \"keda-operator-ffbb595cb-zbw6v\" (UID: \"e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f\") " pod="openshift-keda/keda-operator-ffbb595cb-zbw6v"
Apr 23 16:42:20.225852 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:20.225808 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d57c31c2-0ec3-4660-b723-18f9e60c59eb-certificates\") pod \"keda-admission-cf49989db-s6mxt\" (UID: \"d57c31c2-0ec3-4660-b723-18f9e60c59eb\") " pod="openshift-keda/keda-admission-cf49989db-s6mxt"
Apr 23 16:42:20.225937 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:42:20.225891 2580 secret.go:281] references non-existent secret key: ca.crt
Apr 23 16:42:20.225937 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:42:20.225909 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 23 16:42:20.225937 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:42:20.225917 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-zbw6v: references non-existent secret key: ca.crt
Apr 23 16:42:20.226034 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:42:20.225963 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f-certificates podName:e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f nodeName:}" failed. No retries permitted until 2026-04-23 16:42:21.22594879 +0000 UTC m=+427.287023022 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f-certificates") pod "keda-operator-ffbb595cb-zbw6v" (UID: "e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f") : references non-existent secret key: ca.crt
Apr 23 16:42:20.326866 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:20.326802 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b6k9v\" (UniqueName: \"kubernetes.io/projected/d57c31c2-0ec3-4660-b723-18f9e60c59eb-kube-api-access-b6k9v\") pod \"keda-admission-cf49989db-s6mxt\" (UID: \"d57c31c2-0ec3-4660-b723-18f9e60c59eb\") " pod="openshift-keda/keda-admission-cf49989db-s6mxt"
Apr 23 16:42:20.327017 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:20.326963 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d57c31c2-0ec3-4660-b723-18f9e60c59eb-certificates\") pod \"keda-admission-cf49989db-s6mxt\" (UID: \"d57c31c2-0ec3-4660-b723-18f9e60c59eb\") " pod="openshift-keda/keda-admission-cf49989db-s6mxt"
Apr 23 16:42:20.329463 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:20.329441 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d57c31c2-0ec3-4660-b723-18f9e60c59eb-certificates\") pod \"keda-admission-cf49989db-s6mxt\" (UID: \"d57c31c2-0ec3-4660-b723-18f9e60c59eb\") " pod="openshift-keda/keda-admission-cf49989db-s6mxt"
Apr 23 16:42:20.339522 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:20.339495 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6k9v\" (UniqueName: \"kubernetes.io/projected/d57c31c2-0ec3-4660-b723-18f9e60c59eb-kube-api-access-b6k9v\") pod \"keda-admission-cf49989db-s6mxt\" (UID: \"d57c31c2-0ec3-4660-b723-18f9e60c59eb\") " pod="openshift-keda/keda-admission-cf49989db-s6mxt"
Apr 23 16:42:20.486259 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:20.486224 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-s6mxt"
Apr 23 16:42:20.611515 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:20.611485 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-s6mxt"]
Apr 23 16:42:20.613196 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:42:20.613166 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd57c31c2_0ec3_4660_b723_18f9e60c59eb.slice/crio-2ca91c3c80a5f0f79290d70d4e13e8988d7e5f6d9515051d11533f8e72560403 WatchSource:0}: Error finding container 2ca91c3c80a5f0f79290d70d4e13e8988d7e5f6d9515051d11533f8e72560403: Status 404 returned error can't find the container with id 2ca91c3c80a5f0f79290d70d4e13e8988d7e5f6d9515051d11533f8e72560403
Apr 23 16:42:20.629825 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:20.629798 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/77503e54-fb3e-4f6b-91c8-243d7b500a37-certificates\") pod \"keda-metrics-apiserver-7c9f485588-759fn\" (UID: \"77503e54-fb3e-4f6b-91c8-243d7b500a37\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-759fn"
Apr 23 16:42:20.629924 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:42:20.629912 2580 secret.go:281] references non-existent secret key: tls.crt
Apr 23 16:42:20.629966 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:42:20.629926 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 23 16:42:20.629966 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:42:20.629945 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-759fn: references non-existent secret key: tls.crt
Apr 23 16:42:20.630025 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:42:20.629983 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/77503e54-fb3e-4f6b-91c8-243d7b500a37-certificates podName:77503e54-fb3e-4f6b-91c8-243d7b500a37 nodeName:}" failed. No retries permitted until 2026-04-23 16:42:21.629972478 +0000 UTC m=+427.691046710 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/77503e54-fb3e-4f6b-91c8-243d7b500a37-certificates") pod "keda-metrics-apiserver-7c9f485588-759fn" (UID: "77503e54-fb3e-4f6b-91c8-243d7b500a37") : references non-existent secret key: tls.crt
Apr 23 16:42:20.874883 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:20.874796 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-s6mxt" event={"ID":"d57c31c2-0ec3-4660-b723-18f9e60c59eb","Type":"ContainerStarted","Data":"2ca91c3c80a5f0f79290d70d4e13e8988d7e5f6d9515051d11533f8e72560403"}
Apr 23 16:42:21.235425 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:21.235392 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f-certificates\") pod \"keda-operator-ffbb595cb-zbw6v\" (UID: \"e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f\") " pod="openshift-keda/keda-operator-ffbb595cb-zbw6v"
Apr 23 16:42:21.235589 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:42:21.235512 2580 secret.go:281] references non-existent secret key: ca.crt
Apr 23 16:42:21.235589 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:42:21.235524 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 23 16:42:21.235589 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:42:21.235533 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-zbw6v: references non-existent secret key: ca.crt
Apr 23 16:42:21.235589 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:42:21.235577 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f-certificates podName:e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f nodeName:}" failed. No retries permitted until 2026-04-23 16:42:23.235565126 +0000 UTC m=+429.296639358 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f-certificates") pod "keda-operator-ffbb595cb-zbw6v" (UID: "e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f") : references non-existent secret key: ca.crt
Apr 23 16:42:21.638824 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:21.638743 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/77503e54-fb3e-4f6b-91c8-243d7b500a37-certificates\") pod \"keda-metrics-apiserver-7c9f485588-759fn\" (UID: \"77503e54-fb3e-4f6b-91c8-243d7b500a37\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-759fn"
Apr 23 16:42:21.639159 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:42:21.638889 2580 secret.go:281] references non-existent secret key: tls.crt
Apr 23 16:42:21.639159 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:42:21.638908 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 23 16:42:21.639159 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:42:21.638925 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-759fn: references non-existent secret key: tls.crt
Apr 23 16:42:21.639159 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:42:21.638975 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/77503e54-fb3e-4f6b-91c8-243d7b500a37-certificates podName:77503e54-fb3e-4f6b-91c8-243d7b500a37 nodeName:}" failed. No retries permitted until 2026-04-23 16:42:23.638961562 +0000 UTC m=+429.700035793 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/77503e54-fb3e-4f6b-91c8-243d7b500a37-certificates") pod "keda-metrics-apiserver-7c9f485588-759fn" (UID: "77503e54-fb3e-4f6b-91c8-243d7b500a37") : references non-existent secret key: tls.crt
Apr 23 16:42:23.250611 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:23.250568 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f-certificates\") pod \"keda-operator-ffbb595cb-zbw6v\" (UID: \"e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f\") " pod="openshift-keda/keda-operator-ffbb595cb-zbw6v"
Apr 23 16:42:23.250994 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:42:23.250713 2580 secret.go:281] references non-existent secret key: ca.crt
Apr 23 16:42:23.250994 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:42:23.250735 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 23 16:42:23.250994 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:42:23.250744 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-zbw6v: references non-existent secret key: ca.crt
Apr 23 16:42:23.250994 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:42:23.250794 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f-certificates podName:e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f nodeName:}" failed. No retries permitted until 2026-04-23 16:42:27.250780274 +0000 UTC m=+433.311854511 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f-certificates") pod "keda-operator-ffbb595cb-zbw6v" (UID: "e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f") : references non-existent secret key: ca.crt
Apr 23 16:42:23.653356 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:23.653241 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/77503e54-fb3e-4f6b-91c8-243d7b500a37-certificates\") pod \"keda-metrics-apiserver-7c9f485588-759fn\" (UID: \"77503e54-fb3e-4f6b-91c8-243d7b500a37\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-759fn"
Apr 23 16:42:23.653498 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:42:23.653396 2580 secret.go:281] references non-existent secret key: tls.crt
Apr 23 16:42:23.653498 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:42:23.653418 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 23 16:42:23.653498 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:42:23.653435 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-759fn: references non-existent secret key: tls.crt
Apr 23 16:42:23.653498 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:42:23.653487 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/77503e54-fb3e-4f6b-91c8-243d7b500a37-certificates podName:77503e54-fb3e-4f6b-91c8-243d7b500a37 nodeName:}" failed. No retries permitted until 2026-04-23 16:42:27.653471959 +0000 UTC m=+433.714546195 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/77503e54-fb3e-4f6b-91c8-243d7b500a37-certificates") pod "keda-metrics-apiserver-7c9f485588-759fn" (UID: "77503e54-fb3e-4f6b-91c8-243d7b500a37") : references non-existent secret key: tls.crt
Apr 23 16:42:27.282510 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:27.282475 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f-certificates\") pod \"keda-operator-ffbb595cb-zbw6v\" (UID: \"e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f\") " pod="openshift-keda/keda-operator-ffbb595cb-zbw6v"
Apr 23 16:42:27.284913 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:27.284887 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f-certificates\") pod \"keda-operator-ffbb595cb-zbw6v\" (UID: \"e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f\") " pod="openshift-keda/keda-operator-ffbb595cb-zbw6v"
Apr 23 16:42:27.422357 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:27.422314 2580 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-zbw6v" Apr 23 16:42:27.544564 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:27.544537 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-zbw6v"] Apr 23 16:42:27.547317 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:42:27.547274 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8a68e96_d63a_4d8d_a3a6_daeb9440cd6f.slice/crio-451a7bb8614fec0d9a2bbd32d735a6a75dac5ff7ea79f52e5113a25b654ddd81 WatchSource:0}: Error finding container 451a7bb8614fec0d9a2bbd32d735a6a75dac5ff7ea79f52e5113a25b654ddd81: Status 404 returned error can't find the container with id 451a7bb8614fec0d9a2bbd32d735a6a75dac5ff7ea79f52e5113a25b654ddd81 Apr 23 16:42:27.686458 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:27.686418 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/77503e54-fb3e-4f6b-91c8-243d7b500a37-certificates\") pod \"keda-metrics-apiserver-7c9f485588-759fn\" (UID: \"77503e54-fb3e-4f6b-91c8-243d7b500a37\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-759fn" Apr 23 16:42:27.689043 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:27.688995 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/77503e54-fb3e-4f6b-91c8-243d7b500a37-certificates\") pod \"keda-metrics-apiserver-7c9f485588-759fn\" (UID: \"77503e54-fb3e-4f6b-91c8-243d7b500a37\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-759fn" Apr 23 16:42:27.760468 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:27.760438 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-759fn" Apr 23 16:42:27.880130 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:27.880096 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-759fn"] Apr 23 16:42:27.882861 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:42:27.882833 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77503e54_fb3e_4f6b_91c8_243d7b500a37.slice/crio-2e5a49be25d1f1c642860ef0522b59c5a6d6075f23a5507b34ad12ed8faf5764 WatchSource:0}: Error finding container 2e5a49be25d1f1c642860ef0522b59c5a6d6075f23a5507b34ad12ed8faf5764: Status 404 returned error can't find the container with id 2e5a49be25d1f1c642860ef0522b59c5a6d6075f23a5507b34ad12ed8faf5764 Apr 23 16:42:27.897402 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:27.897376 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-zbw6v" event={"ID":"e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f","Type":"ContainerStarted","Data":"451a7bb8614fec0d9a2bbd32d735a6a75dac5ff7ea79f52e5113a25b654ddd81"} Apr 23 16:42:27.898335 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:27.898312 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-759fn" event={"ID":"77503e54-fb3e-4f6b-91c8-243d7b500a37","Type":"ContainerStarted","Data":"2e5a49be25d1f1c642860ef0522b59c5a6d6075f23a5507b34ad12ed8faf5764"} Apr 23 16:42:31.915329 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:31.915222 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-zbw6v" event={"ID":"e8a68e96-d63a-4d8d-a3a6-daeb9440cd6f","Type":"ContainerStarted","Data":"07db786b2cc2262e722820eee01c75cd9fba2553f3ef1de34bb7781c6b36e7dd"} Apr 23 16:42:31.915757 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:31.915349 2580 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-zbw6v" Apr 23 16:42:31.916881 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:31.916854 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-s6mxt" event={"ID":"d57c31c2-0ec3-4660-b723-18f9e60c59eb","Type":"ContainerStarted","Data":"e923b0de6811332f0d552d3a28feb503d5f12057456d44ff12a68f5877956c20"} Apr 23 16:42:31.917014 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:31.916977 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-s6mxt" Apr 23 16:42:31.918361 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:31.918317 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-759fn" event={"ID":"77503e54-fb3e-4f6b-91c8-243d7b500a37","Type":"ContainerStarted","Data":"6a5a816c2d7f58e1ae708402df393888c72fcb94615fdd9ebf0d36b0a92bade2"} Apr 23 16:42:31.918527 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:31.918512 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-759fn" Apr 23 16:42:31.933201 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:31.933157 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-zbw6v" podStartSLOduration=8.893151191 podStartE2EDuration="12.933144774s" podCreationTimestamp="2026-04-23 16:42:19 +0000 UTC" firstStartedPulling="2026-04-23 16:42:27.548588901 +0000 UTC m=+433.609663132" lastFinishedPulling="2026-04-23 16:42:31.58858248 +0000 UTC m=+437.649656715" observedRunningTime="2026-04-23 16:42:31.931611586 +0000 UTC m=+437.992685841" watchObservedRunningTime="2026-04-23 16:42:31.933144774 +0000 UTC m=+437.994219028" Apr 23 16:42:31.948652 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:31.948606 2580 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-759fn" podStartSLOduration=9.243576525 podStartE2EDuration="12.948593019s" podCreationTimestamp="2026-04-23 16:42:19 +0000 UTC" firstStartedPulling="2026-04-23 16:42:27.884192797 +0000 UTC m=+433.945267029" lastFinishedPulling="2026-04-23 16:42:31.589209291 +0000 UTC m=+437.650283523" observedRunningTime="2026-04-23 16:42:31.947901321 +0000 UTC m=+438.008975598" watchObservedRunningTime="2026-04-23 16:42:31.948593019 +0000 UTC m=+438.009667272" Apr 23 16:42:31.966668 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:31.966626 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-s6mxt" podStartSLOduration=0.942859835 podStartE2EDuration="11.966614725s" podCreationTimestamp="2026-04-23 16:42:20 +0000 UTC" firstStartedPulling="2026-04-23 16:42:20.614508871 +0000 UTC m=+426.675583102" lastFinishedPulling="2026-04-23 16:42:31.638263757 +0000 UTC m=+437.699337992" observedRunningTime="2026-04-23 16:42:31.965514503 +0000 UTC m=+438.026588770" watchObservedRunningTime="2026-04-23 16:42:31.966614725 +0000 UTC m=+438.027689014" Apr 23 16:42:40.877766 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:40.877732 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-q6nrg" Apr 23 16:42:42.926844 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:42.926809 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-759fn" Apr 23 16:42:52.924005 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:52.923970 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-zbw6v" Apr 23 16:42:52.924479 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:42:52.924233 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-keda/keda-admission-cf49989db-s6mxt" Apr 23 16:43:13.631479 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:13.631445 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dm7s2l"] Apr 23 16:43:13.637063 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:13.637042 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dm7s2l" Apr 23 16:43:13.640125 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:13.640102 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 23 16:43:13.640898 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:13.640860 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5s2x5\"" Apr 23 16:43:13.641018 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:13.640899 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 23 16:43:13.641577 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:13.641557 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dm7s2l"] Apr 23 16:43:13.749797 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:13.749758 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4p48\" (UniqueName: \"kubernetes.io/projected/9d261d73-9f66-4169-8360-c28ef43f3d30-kube-api-access-w4p48\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dm7s2l\" (UID: \"9d261d73-9f66-4169-8360-c28ef43f3d30\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dm7s2l" Apr 23 16:43:13.749978 ip-10-0-128-198 kubenswrapper[2580]: I0423 
16:43:13.749801 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d261d73-9f66-4169-8360-c28ef43f3d30-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dm7s2l\" (UID: \"9d261d73-9f66-4169-8360-c28ef43f3d30\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dm7s2l" Apr 23 16:43:13.749978 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:13.749844 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d261d73-9f66-4169-8360-c28ef43f3d30-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dm7s2l\" (UID: \"9d261d73-9f66-4169-8360-c28ef43f3d30\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dm7s2l" Apr 23 16:43:13.850873 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:13.850835 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w4p48\" (UniqueName: \"kubernetes.io/projected/9d261d73-9f66-4169-8360-c28ef43f3d30-kube-api-access-w4p48\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dm7s2l\" (UID: \"9d261d73-9f66-4169-8360-c28ef43f3d30\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dm7s2l" Apr 23 16:43:13.851036 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:13.850886 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d261d73-9f66-4169-8360-c28ef43f3d30-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dm7s2l\" (UID: \"9d261d73-9f66-4169-8360-c28ef43f3d30\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dm7s2l" Apr 23 16:43:13.851036 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:13.850937 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d261d73-9f66-4169-8360-c28ef43f3d30-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dm7s2l\" (UID: \"9d261d73-9f66-4169-8360-c28ef43f3d30\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dm7s2l" Apr 23 16:43:13.851315 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:13.851273 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d261d73-9f66-4169-8360-c28ef43f3d30-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dm7s2l\" (UID: \"9d261d73-9f66-4169-8360-c28ef43f3d30\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dm7s2l" Apr 23 16:43:13.851364 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:13.851348 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d261d73-9f66-4169-8360-c28ef43f3d30-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dm7s2l\" (UID: \"9d261d73-9f66-4169-8360-c28ef43f3d30\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dm7s2l" Apr 23 16:43:13.859117 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:13.859094 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4p48\" (UniqueName: \"kubernetes.io/projected/9d261d73-9f66-4169-8360-c28ef43f3d30-kube-api-access-w4p48\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dm7s2l\" (UID: \"9d261d73-9f66-4169-8360-c28ef43f3d30\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dm7s2l" Apr 23 16:43:13.948311 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:13.948208 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dm7s2l" Apr 23 16:43:14.069394 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:14.069361 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dm7s2l"] Apr 23 16:43:14.072925 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:43:14.072892 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d261d73_9f66_4169_8360_c28ef43f3d30.slice/crio-f4665f037c11721346b4ef0b9d0988274ca6a3b96dfb08dee2aac570ab091637 WatchSource:0}: Error finding container f4665f037c11721346b4ef0b9d0988274ca6a3b96dfb08dee2aac570ab091637: Status 404 returned error can't find the container with id f4665f037c11721346b4ef0b9d0988274ca6a3b96dfb08dee2aac570ab091637 Apr 23 16:43:15.059360 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:15.059324 2580 generic.go:358] "Generic (PLEG): container finished" podID="9d261d73-9f66-4169-8360-c28ef43f3d30" containerID="611965d93cc3ff0ea4bb3e86b76835adab0087915eb29862fad11e937c9b1188" exitCode=0 Apr 23 16:43:15.059763 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:15.059413 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dm7s2l" event={"ID":"9d261d73-9f66-4169-8360-c28ef43f3d30","Type":"ContainerDied","Data":"611965d93cc3ff0ea4bb3e86b76835adab0087915eb29862fad11e937c9b1188"} Apr 23 16:43:15.059763 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:15.059450 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dm7s2l" event={"ID":"9d261d73-9f66-4169-8360-c28ef43f3d30","Type":"ContainerStarted","Data":"f4665f037c11721346b4ef0b9d0988274ca6a3b96dfb08dee2aac570ab091637"} Apr 23 16:43:22.087361 ip-10-0-128-198 kubenswrapper[2580]: 
I0423 16:43:22.087326 2580 generic.go:358] "Generic (PLEG): container finished" podID="9d261d73-9f66-4169-8360-c28ef43f3d30" containerID="6e850e387ad8b21de09e8739d4779bc059c769a45ff70cfda0de409e9b57ce35" exitCode=0 Apr 23 16:43:22.087730 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:22.087379 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dm7s2l" event={"ID":"9d261d73-9f66-4169-8360-c28ef43f3d30","Type":"ContainerDied","Data":"6e850e387ad8b21de09e8739d4779bc059c769a45ff70cfda0de409e9b57ce35"} Apr 23 16:43:23.092677 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:23.092642 2580 generic.go:358] "Generic (PLEG): container finished" podID="9d261d73-9f66-4169-8360-c28ef43f3d30" containerID="99d92e1f060d7518dc36dea2e0a00d28d00d45781627c56c8fddb6aaba7d196f" exitCode=0 Apr 23 16:43:23.093041 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:23.092708 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dm7s2l" event={"ID":"9d261d73-9f66-4169-8360-c28ef43f3d30","Type":"ContainerDied","Data":"99d92e1f060d7518dc36dea2e0a00d28d00d45781627c56c8fddb6aaba7d196f"} Apr 23 16:43:24.213925 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:24.213901 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dm7s2l" Apr 23 16:43:24.335985 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:24.335949 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d261d73-9f66-4169-8360-c28ef43f3d30-util\") pod \"9d261d73-9f66-4169-8360-c28ef43f3d30\" (UID: \"9d261d73-9f66-4169-8360-c28ef43f3d30\") " Apr 23 16:43:24.336154 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:24.335999 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4p48\" (UniqueName: \"kubernetes.io/projected/9d261d73-9f66-4169-8360-c28ef43f3d30-kube-api-access-w4p48\") pod \"9d261d73-9f66-4169-8360-c28ef43f3d30\" (UID: \"9d261d73-9f66-4169-8360-c28ef43f3d30\") " Apr 23 16:43:24.336154 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:24.336045 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d261d73-9f66-4169-8360-c28ef43f3d30-bundle\") pod \"9d261d73-9f66-4169-8360-c28ef43f3d30\" (UID: \"9d261d73-9f66-4169-8360-c28ef43f3d30\") " Apr 23 16:43:24.336773 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:24.336739 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d261d73-9f66-4169-8360-c28ef43f3d30-bundle" (OuterVolumeSpecName: "bundle") pod "9d261d73-9f66-4169-8360-c28ef43f3d30" (UID: "9d261d73-9f66-4169-8360-c28ef43f3d30"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:43:24.338139 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:24.338120 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d261d73-9f66-4169-8360-c28ef43f3d30-kube-api-access-w4p48" (OuterVolumeSpecName: "kube-api-access-w4p48") pod "9d261d73-9f66-4169-8360-c28ef43f3d30" (UID: "9d261d73-9f66-4169-8360-c28ef43f3d30"). InnerVolumeSpecName "kube-api-access-w4p48". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:43:24.342793 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:24.342755 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d261d73-9f66-4169-8360-c28ef43f3d30-util" (OuterVolumeSpecName: "util") pod "9d261d73-9f66-4169-8360-c28ef43f3d30" (UID: "9d261d73-9f66-4169-8360-c28ef43f3d30"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:43:24.436888 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:24.436796 2580 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d261d73-9f66-4169-8360-c28ef43f3d30-util\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:43:24.436888 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:24.436836 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w4p48\" (UniqueName: \"kubernetes.io/projected/9d261d73-9f66-4169-8360-c28ef43f3d30-kube-api-access-w4p48\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:43:24.436888 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:24.436848 2580 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d261d73-9f66-4169-8360-c28ef43f3d30-bundle\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:43:25.101650 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:25.101616 2580 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dm7s2l" event={"ID":"9d261d73-9f66-4169-8360-c28ef43f3d30","Type":"ContainerDied","Data":"f4665f037c11721346b4ef0b9d0988274ca6a3b96dfb08dee2aac570ab091637"} Apr 23 16:43:25.101650 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:25.101652 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4665f037c11721346b4ef0b9d0988274ca6a3b96dfb08dee2aac570ab091637" Apr 23 16:43:25.101859 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:25.101625 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dm7s2l" Apr 23 16:43:41.035245 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:41.035212 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtxjv"] Apr 23 16:43:41.035633 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:41.035546 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d261d73-9f66-4169-8360-c28ef43f3d30" containerName="util" Apr 23 16:43:41.035633 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:41.035557 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d261d73-9f66-4169-8360-c28ef43f3d30" containerName="util" Apr 23 16:43:41.035633 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:41.035564 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d261d73-9f66-4169-8360-c28ef43f3d30" containerName="extract" Apr 23 16:43:41.035633 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:41.035570 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d261d73-9f66-4169-8360-c28ef43f3d30" containerName="extract" Apr 23 16:43:41.035633 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:41.035577 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="9d261d73-9f66-4169-8360-c28ef43f3d30" containerName="pull" Apr 23 16:43:41.035633 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:41.035583 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d261d73-9f66-4169-8360-c28ef43f3d30" containerName="pull" Apr 23 16:43:41.035633 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:41.035629 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d261d73-9f66-4169-8360-c28ef43f3d30" containerName="extract" Apr 23 16:43:41.039802 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:41.039758 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtxjv" Apr 23 16:43:41.043147 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:41.043113 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 23 16:43:41.043286 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:41.043112 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 23 16:43:41.043286 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:41.043158 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5s2x5\"" Apr 23 16:43:41.050445 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:41.050422 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtxjv"] Apr 23 16:43:41.068790 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:41.068752 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkcsk\" (UniqueName: \"kubernetes.io/projected/cb88b911-e82f-4d50-98c5-084d2ee39b70-kube-api-access-bkcsk\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtxjv\" 
(UID: \"cb88b911-e82f-4d50-98c5-084d2ee39b70\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtxjv" Apr 23 16:43:41.068910 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:41.068821 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb88b911-e82f-4d50-98c5-084d2ee39b70-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtxjv\" (UID: \"cb88b911-e82f-4d50-98c5-084d2ee39b70\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtxjv" Apr 23 16:43:41.068910 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:41.068845 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb88b911-e82f-4d50-98c5-084d2ee39b70-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtxjv\" (UID: \"cb88b911-e82f-4d50-98c5-084d2ee39b70\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtxjv" Apr 23 16:43:41.169395 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:41.169363 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb88b911-e82f-4d50-98c5-084d2ee39b70-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtxjv\" (UID: \"cb88b911-e82f-4d50-98c5-084d2ee39b70\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtxjv" Apr 23 16:43:41.169553 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:41.169413 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bkcsk\" (UniqueName: \"kubernetes.io/projected/cb88b911-e82f-4d50-98c5-084d2ee39b70-kube-api-access-bkcsk\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtxjv\" (UID: 
\"cb88b911-e82f-4d50-98c5-084d2ee39b70\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtxjv" Apr 23 16:43:41.169553 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:41.169471 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb88b911-e82f-4d50-98c5-084d2ee39b70-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtxjv\" (UID: \"cb88b911-e82f-4d50-98c5-084d2ee39b70\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtxjv" Apr 23 16:43:41.169844 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:41.169811 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb88b911-e82f-4d50-98c5-084d2ee39b70-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtxjv\" (UID: \"cb88b911-e82f-4d50-98c5-084d2ee39b70\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtxjv" Apr 23 16:43:41.169886 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:41.169856 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb88b911-e82f-4d50-98c5-084d2ee39b70-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtxjv\" (UID: \"cb88b911-e82f-4d50-98c5-084d2ee39b70\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtxjv" Apr 23 16:43:41.182640 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:41.182608 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkcsk\" (UniqueName: \"kubernetes.io/projected/cb88b911-e82f-4d50-98c5-084d2ee39b70-kube-api-access-bkcsk\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtxjv\" (UID: \"cb88b911-e82f-4d50-98c5-084d2ee39b70\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtxjv" Apr 23 16:43:41.349556 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:41.349471 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtxjv" Apr 23 16:43:41.471754 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:41.471715 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtxjv"] Apr 23 16:43:41.475287 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:43:41.475253 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb88b911_e82f_4d50_98c5_084d2ee39b70.slice/crio-e96029e4fda2139b890e85dce309b2da827fbd829b056b010cec89a605d709b6 WatchSource:0}: Error finding container e96029e4fda2139b890e85dce309b2da827fbd829b056b010cec89a605d709b6: Status 404 returned error can't find the container with id e96029e4fda2139b890e85dce309b2da827fbd829b056b010cec89a605d709b6 Apr 23 16:43:42.161976 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:42.161935 2580 generic.go:358] "Generic (PLEG): container finished" podID="cb88b911-e82f-4d50-98c5-084d2ee39b70" containerID="1b732e2a8c8780e9f4dfae3f95971999cf4fdb7620c2d00f9016b09a7ac2f227" exitCode=0 Apr 23 16:43:42.162395 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:42.162000 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtxjv" event={"ID":"cb88b911-e82f-4d50-98c5-084d2ee39b70","Type":"ContainerDied","Data":"1b732e2a8c8780e9f4dfae3f95971999cf4fdb7620c2d00f9016b09a7ac2f227"} Apr 23 16:43:42.162395 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:42.162026 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtxjv" event={"ID":"cb88b911-e82f-4d50-98c5-084d2ee39b70","Type":"ContainerStarted","Data":"e96029e4fda2139b890e85dce309b2da827fbd829b056b010cec89a605d709b6"} Apr 23 16:43:45.173514 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:45.173422 2580 generic.go:358] "Generic (PLEG): container finished" podID="cb88b911-e82f-4d50-98c5-084d2ee39b70" containerID="74b005792590bde459fc715fa38eac7954a9cf4350b3ef96a014dd3ab76e5563" exitCode=0 Apr 23 16:43:45.173514 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:45.173501 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtxjv" event={"ID":"cb88b911-e82f-4d50-98c5-084d2ee39b70","Type":"ContainerDied","Data":"74b005792590bde459fc715fa38eac7954a9cf4350b3ef96a014dd3ab76e5563"} Apr 23 16:43:46.178951 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:46.178912 2580 generic.go:358] "Generic (PLEG): container finished" podID="cb88b911-e82f-4d50-98c5-084d2ee39b70" containerID="59f954b955fe239591540b3b66b6b452f302bd544107164ae3434a90b6bc2233" exitCode=0 Apr 23 16:43:46.179340 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:46.178996 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtxjv" event={"ID":"cb88b911-e82f-4d50-98c5-084d2ee39b70","Type":"ContainerDied","Data":"59f954b955fe239591540b3b66b6b452f302bd544107164ae3434a90b6bc2233"} Apr 23 16:43:47.149717 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:47.149681 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-74r9m"] Apr 23 16:43:47.153051 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:47.153033 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-74r9m" Apr 23 16:43:47.157117 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:47.157093 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-xnrfh\"" Apr 23 16:43:47.157253 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:47.157204 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 23 16:43:47.158248 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:47.158230 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 23 16:43:47.165359 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:47.165335 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-74r9m"] Apr 23 16:43:47.212178 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:47.212143 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88jbl\" (UniqueName: \"kubernetes.io/projected/f1c41e54-1bf6-44d6-bb07-da2cde99f8a7-kube-api-access-88jbl\") pod \"cert-manager-79c8d999ff-74r9m\" (UID: \"f1c41e54-1bf6-44d6-bb07-da2cde99f8a7\") " pod="cert-manager/cert-manager-79c8d999ff-74r9m" Apr 23 16:43:47.212560 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:47.212217 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f1c41e54-1bf6-44d6-bb07-da2cde99f8a7-bound-sa-token\") pod \"cert-manager-79c8d999ff-74r9m\" (UID: \"f1c41e54-1bf6-44d6-bb07-da2cde99f8a7\") " pod="cert-manager/cert-manager-79c8d999ff-74r9m" Apr 23 16:43:47.299620 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:47.299593 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtxjv" Apr 23 16:43:47.313157 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:47.313130 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb88b911-e82f-4d50-98c5-084d2ee39b70-bundle\") pod \"cb88b911-e82f-4d50-98c5-084d2ee39b70\" (UID: \"cb88b911-e82f-4d50-98c5-084d2ee39b70\") " Apr 23 16:43:47.313312 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:47.313169 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb88b911-e82f-4d50-98c5-084d2ee39b70-util\") pod \"cb88b911-e82f-4d50-98c5-084d2ee39b70\" (UID: \"cb88b911-e82f-4d50-98c5-084d2ee39b70\") " Apr 23 16:43:47.313312 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:47.313194 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkcsk\" (UniqueName: \"kubernetes.io/projected/cb88b911-e82f-4d50-98c5-084d2ee39b70-kube-api-access-bkcsk\") pod \"cb88b911-e82f-4d50-98c5-084d2ee39b70\" (UID: \"cb88b911-e82f-4d50-98c5-084d2ee39b70\") " Apr 23 16:43:47.313312 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:47.313269 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f1c41e54-1bf6-44d6-bb07-da2cde99f8a7-bound-sa-token\") pod \"cert-manager-79c8d999ff-74r9m\" (UID: \"f1c41e54-1bf6-44d6-bb07-da2cde99f8a7\") " pod="cert-manager/cert-manager-79c8d999ff-74r9m" Apr 23 16:43:47.313489 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:47.313356 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88jbl\" (UniqueName: \"kubernetes.io/projected/f1c41e54-1bf6-44d6-bb07-da2cde99f8a7-kube-api-access-88jbl\") pod \"cert-manager-79c8d999ff-74r9m\" (UID: 
\"f1c41e54-1bf6-44d6-bb07-da2cde99f8a7\") " pod="cert-manager/cert-manager-79c8d999ff-74r9m" Apr 23 16:43:47.313609 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:47.313583 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb88b911-e82f-4d50-98c5-084d2ee39b70-bundle" (OuterVolumeSpecName: "bundle") pod "cb88b911-e82f-4d50-98c5-084d2ee39b70" (UID: "cb88b911-e82f-4d50-98c5-084d2ee39b70"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:43:47.315305 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:47.315272 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb88b911-e82f-4d50-98c5-084d2ee39b70-kube-api-access-bkcsk" (OuterVolumeSpecName: "kube-api-access-bkcsk") pod "cb88b911-e82f-4d50-98c5-084d2ee39b70" (UID: "cb88b911-e82f-4d50-98c5-084d2ee39b70"). InnerVolumeSpecName "kube-api-access-bkcsk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:43:47.318192 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:47.318161 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb88b911-e82f-4d50-98c5-084d2ee39b70-util" (OuterVolumeSpecName: "util") pod "cb88b911-e82f-4d50-98c5-084d2ee39b70" (UID: "cb88b911-e82f-4d50-98c5-084d2ee39b70"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:43:47.330009 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:47.329982 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f1c41e54-1bf6-44d6-bb07-da2cde99f8a7-bound-sa-token\") pod \"cert-manager-79c8d999ff-74r9m\" (UID: \"f1c41e54-1bf6-44d6-bb07-da2cde99f8a7\") " pod="cert-manager/cert-manager-79c8d999ff-74r9m" Apr 23 16:43:47.330877 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:47.330851 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-88jbl\" (UniqueName: \"kubernetes.io/projected/f1c41e54-1bf6-44d6-bb07-da2cde99f8a7-kube-api-access-88jbl\") pod \"cert-manager-79c8d999ff-74r9m\" (UID: \"f1c41e54-1bf6-44d6-bb07-da2cde99f8a7\") " pod="cert-manager/cert-manager-79c8d999ff-74r9m" Apr 23 16:43:47.413998 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:47.413924 2580 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb88b911-e82f-4d50-98c5-084d2ee39b70-bundle\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:43:47.413998 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:47.413950 2580 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb88b911-e82f-4d50-98c5-084d2ee39b70-util\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:43:47.413998 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:47.413959 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bkcsk\" (UniqueName: \"kubernetes.io/projected/cb88b911-e82f-4d50-98c5-084d2ee39b70-kube-api-access-bkcsk\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:43:47.472574 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:47.472531 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-74r9m" Apr 23 16:43:47.592151 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:47.592118 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-74r9m"] Apr 23 16:43:47.594399 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:43:47.594371 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1c41e54_1bf6_44d6_bb07_da2cde99f8a7.slice/crio-3f0bb622048e703140a1f0a6785b1ff573e925e497d82c732a79b44ef130701e WatchSource:0}: Error finding container 3f0bb622048e703140a1f0a6785b1ff573e925e497d82c732a79b44ef130701e: Status 404 returned error can't find the container with id 3f0bb622048e703140a1f0a6785b1ff573e925e497d82c732a79b44ef130701e Apr 23 16:43:48.188911 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:48.188872 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtxjv" event={"ID":"cb88b911-e82f-4d50-98c5-084d2ee39b70","Type":"ContainerDied","Data":"e96029e4fda2139b890e85dce309b2da827fbd829b056b010cec89a605d709b6"} Apr 23 16:43:48.188911 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:48.188900 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtxjv" Apr 23 16:43:48.188911 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:48.188917 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e96029e4fda2139b890e85dce309b2da827fbd829b056b010cec89a605d709b6" Apr 23 16:43:48.189954 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:48.189928 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-74r9m" event={"ID":"f1c41e54-1bf6-44d6-bb07-da2cde99f8a7","Type":"ContainerStarted","Data":"3f0bb622048e703140a1f0a6785b1ff573e925e497d82c732a79b44ef130701e"} Apr 23 16:43:54.047201 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:54.047166 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-wblb7"] Apr 23 16:43:54.047582 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:54.047490 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb88b911-e82f-4d50-98c5-084d2ee39b70" containerName="util" Apr 23 16:43:54.047582 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:54.047503 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb88b911-e82f-4d50-98c5-084d2ee39b70" containerName="util" Apr 23 16:43:54.047582 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:54.047512 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb88b911-e82f-4d50-98c5-084d2ee39b70" containerName="pull" Apr 23 16:43:54.047582 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:54.047517 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb88b911-e82f-4d50-98c5-084d2ee39b70" containerName="pull" Apr 23 16:43:54.047582 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:54.047539 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb88b911-e82f-4d50-98c5-084d2ee39b70" containerName="extract" Apr 23 
16:43:54.047582 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:54.047546 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb88b911-e82f-4d50-98c5-084d2ee39b70" containerName="extract" Apr 23 16:43:54.047820 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:54.047597 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="cb88b911-e82f-4d50-98c5-084d2ee39b70" containerName="extract" Apr 23 16:43:54.053037 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:54.053018 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wblb7" Apr 23 16:43:54.056263 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:54.056233 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-6tktn\"" Apr 23 16:43:54.057687 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:54.057663 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 23 16:43:54.057809 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:54.057718 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 23 16:43:54.060219 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:54.060197 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-wblb7"] Apr 23 16:43:54.063411 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:54.063391 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5064328f-c6a1-4ada-86d4-02b842b09a19-tmp\") pod \"openshift-lws-operator-bfc7f696d-wblb7\" (UID: \"5064328f-c6a1-4ada-86d4-02b842b09a19\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wblb7" Apr 23 16:43:54.063509 ip-10-0-128-198 
kubenswrapper[2580]: I0423 16:43:54.063432 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbfkx\" (UniqueName: \"kubernetes.io/projected/5064328f-c6a1-4ada-86d4-02b842b09a19-kube-api-access-dbfkx\") pod \"openshift-lws-operator-bfc7f696d-wblb7\" (UID: \"5064328f-c6a1-4ada-86d4-02b842b09a19\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wblb7" Apr 23 16:43:54.163890 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:54.163848 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5064328f-c6a1-4ada-86d4-02b842b09a19-tmp\") pod \"openshift-lws-operator-bfc7f696d-wblb7\" (UID: \"5064328f-c6a1-4ada-86d4-02b842b09a19\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wblb7" Apr 23 16:43:54.164058 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:54.163901 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbfkx\" (UniqueName: \"kubernetes.io/projected/5064328f-c6a1-4ada-86d4-02b842b09a19-kube-api-access-dbfkx\") pod \"openshift-lws-operator-bfc7f696d-wblb7\" (UID: \"5064328f-c6a1-4ada-86d4-02b842b09a19\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wblb7" Apr 23 16:43:54.164230 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:54.164206 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5064328f-c6a1-4ada-86d4-02b842b09a19-tmp\") pod \"openshift-lws-operator-bfc7f696d-wblb7\" (UID: \"5064328f-c6a1-4ada-86d4-02b842b09a19\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wblb7" Apr 23 16:43:54.177047 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:54.177022 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbfkx\" (UniqueName: 
\"kubernetes.io/projected/5064328f-c6a1-4ada-86d4-02b842b09a19-kube-api-access-dbfkx\") pod \"openshift-lws-operator-bfc7f696d-wblb7\" (UID: \"5064328f-c6a1-4ada-86d4-02b842b09a19\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wblb7" Apr 23 16:43:54.363819 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:54.363730 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wblb7" Apr 23 16:43:54.485602 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:54.485578 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-wblb7"] Apr 23 16:43:54.488248 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:43:54.488221 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5064328f_c6a1_4ada_86d4_02b842b09a19.slice/crio-faec5870dec031b98ee891e11bedc3c6a9e6f10b5ef351fc47cad5cf32e80274 WatchSource:0}: Error finding container faec5870dec031b98ee891e11bedc3c6a9e6f10b5ef351fc47cad5cf32e80274: Status 404 returned error can't find the container with id faec5870dec031b98ee891e11bedc3c6a9e6f10b5ef351fc47cad5cf32e80274 Apr 23 16:43:55.215957 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:55.215918 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wblb7" event={"ID":"5064328f-c6a1-4ada-86d4-02b842b09a19","Type":"ContainerStarted","Data":"faec5870dec031b98ee891e11bedc3c6a9e6f10b5ef351fc47cad5cf32e80274"} Apr 23 16:43:58.228978 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:43:58.228885 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wblb7" event={"ID":"5064328f-c6a1-4ada-86d4-02b842b09a19","Type":"ContainerStarted","Data":"7b81e6c6209f5d2f8f51930eef11e32fbdaea785a186357f6da07598035a205d"} Apr 23 16:43:58.248008 ip-10-0-128-198 
kubenswrapper[2580]: I0423 16:43:58.247954 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wblb7" podStartSLOduration=1.3848920900000001 podStartE2EDuration="4.247939119s" podCreationTimestamp="2026-04-23 16:43:54 +0000 UTC" firstStartedPulling="2026-04-23 16:43:54.489653981 +0000 UTC m=+520.550728214" lastFinishedPulling="2026-04-23 16:43:57.352701006 +0000 UTC m=+523.413775243" observedRunningTime="2026-04-23 16:43:58.247274286 +0000 UTC m=+524.308348542" watchObservedRunningTime="2026-04-23 16:43:58.247939119 +0000 UTC m=+524.309013372" Apr 23 16:44:08.261587 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:08.261549 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-74r9m" event={"ID":"f1c41e54-1bf6-44d6-bb07-da2cde99f8a7","Type":"ContainerStarted","Data":"4fa787b7e6626c27efb0607775f791cf3fc7945298b4c44516fb35bc34b3006e"} Apr 23 16:44:08.277762 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:08.277710 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-74r9m" podStartSLOduration=1.587330895 podStartE2EDuration="21.277696048s" podCreationTimestamp="2026-04-23 16:43:47 +0000 UTC" firstStartedPulling="2026-04-23 16:43:47.596150139 +0000 UTC m=+513.657224371" lastFinishedPulling="2026-04-23 16:44:07.286515293 +0000 UTC m=+533.347589524" observedRunningTime="2026-04-23 16:44:08.276923657 +0000 UTC m=+534.337997911" watchObservedRunningTime="2026-04-23 16:44:08.277696048 +0000 UTC m=+534.338770302" Apr 23 16:44:11.433159 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:11.433122 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dd9xx"] Apr 23 16:44:11.436605 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:11.436589 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dd9xx" Apr 23 16:44:11.439269 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:11.439241 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 23 16:44:11.439405 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:11.439251 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 23 16:44:11.440448 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:11.440426 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5s2x5\"" Apr 23 16:44:11.443513 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:11.443475 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dd9xx"] Apr 23 16:44:11.501484 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:11.501451 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/570f4226-98c9-49c5-b65a-0fdd687c3926-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dd9xx\" (UID: \"570f4226-98c9-49c5-b65a-0fdd687c3926\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dd9xx" Apr 23 16:44:11.501640 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:11.501489 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/570f4226-98c9-49c5-b65a-0fdd687c3926-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dd9xx\" (UID: \"570f4226-98c9-49c5-b65a-0fdd687c3926\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dd9xx" Apr 23 16:44:11.501640 
ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:11.501517 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgfhk\" (UniqueName: \"kubernetes.io/projected/570f4226-98c9-49c5-b65a-0fdd687c3926-kube-api-access-fgfhk\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dd9xx\" (UID: \"570f4226-98c9-49c5-b65a-0fdd687c3926\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dd9xx" Apr 23 16:44:11.602889 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:11.602854 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgfhk\" (UniqueName: \"kubernetes.io/projected/570f4226-98c9-49c5-b65a-0fdd687c3926-kube-api-access-fgfhk\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dd9xx\" (UID: \"570f4226-98c9-49c5-b65a-0fdd687c3926\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dd9xx" Apr 23 16:44:11.603054 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:11.602953 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/570f4226-98c9-49c5-b65a-0fdd687c3926-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dd9xx\" (UID: \"570f4226-98c9-49c5-b65a-0fdd687c3926\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dd9xx" Apr 23 16:44:11.603054 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:11.602993 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/570f4226-98c9-49c5-b65a-0fdd687c3926-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dd9xx\" (UID: \"570f4226-98c9-49c5-b65a-0fdd687c3926\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dd9xx" Apr 23 16:44:11.603351 ip-10-0-128-198 
kubenswrapper[2580]: I0423 16:44:11.603331 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/570f4226-98c9-49c5-b65a-0fdd687c3926-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dd9xx\" (UID: \"570f4226-98c9-49c5-b65a-0fdd687c3926\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dd9xx" Apr 23 16:44:11.603398 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:11.603368 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/570f4226-98c9-49c5-b65a-0fdd687c3926-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dd9xx\" (UID: \"570f4226-98c9-49c5-b65a-0fdd687c3926\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dd9xx" Apr 23 16:44:11.611199 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:11.611175 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgfhk\" (UniqueName: \"kubernetes.io/projected/570f4226-98c9-49c5-b65a-0fdd687c3926-kube-api-access-fgfhk\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dd9xx\" (UID: \"570f4226-98c9-49c5-b65a-0fdd687c3926\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dd9xx" Apr 23 16:44:11.746940 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:11.746905 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dd9xx" Apr 23 16:44:11.873696 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:11.873600 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dd9xx"] Apr 23 16:44:11.876132 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:44:11.876102 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod570f4226_98c9_49c5_b65a_0fdd687c3926.slice/crio-e1b531775617822d26f9faa8d3d10ca93b4535fed2f02a8a3816bfb7e874a661 WatchSource:0}: Error finding container e1b531775617822d26f9faa8d3d10ca93b4535fed2f02a8a3816bfb7e874a661: Status 404 returned error can't find the container with id e1b531775617822d26f9faa8d3d10ca93b4535fed2f02a8a3816bfb7e874a661 Apr 23 16:44:12.277030 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:12.276991 2580 generic.go:358] "Generic (PLEG): container finished" podID="570f4226-98c9-49c5-b65a-0fdd687c3926" containerID="e9057b77b6e552a6000dc1a7e356cbb60d0bd759e16373a145df80c8e7893a00" exitCode=0 Apr 23 16:44:12.277196 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:12.277074 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dd9xx" event={"ID":"570f4226-98c9-49c5-b65a-0fdd687c3926","Type":"ContainerDied","Data":"e9057b77b6e552a6000dc1a7e356cbb60d0bd759e16373a145df80c8e7893a00"} Apr 23 16:44:12.277196 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:12.277107 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dd9xx" event={"ID":"570f4226-98c9-49c5-b65a-0fdd687c3926","Type":"ContainerStarted","Data":"e1b531775617822d26f9faa8d3d10ca93b4535fed2f02a8a3816bfb7e874a661"} Apr 23 16:44:13.282245 ip-10-0-128-198 kubenswrapper[2580]: 
I0423 16:44:13.282213 2580 generic.go:358] "Generic (PLEG): container finished" podID="570f4226-98c9-49c5-b65a-0fdd687c3926" containerID="92de9080b3470455408c91431d2d6a3e90945ff5c7d8cfc5a0655c3f7e3fd274" exitCode=0 Apr 23 16:44:13.282624 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:13.282318 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dd9xx" event={"ID":"570f4226-98c9-49c5-b65a-0fdd687c3926","Type":"ContainerDied","Data":"92de9080b3470455408c91431d2d6a3e90945ff5c7d8cfc5a0655c3f7e3fd274"} Apr 23 16:44:14.287106 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:14.287071 2580 generic.go:358] "Generic (PLEG): container finished" podID="570f4226-98c9-49c5-b65a-0fdd687c3926" containerID="3b84daf78942e9ca5ef7b58dfc3b987ef3abe916fb9fe196b088c3e3ae0c467e" exitCode=0 Apr 23 16:44:14.287499 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:14.287150 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dd9xx" event={"ID":"570f4226-98c9-49c5-b65a-0fdd687c3926","Type":"ContainerDied","Data":"3b84daf78942e9ca5ef7b58dfc3b987ef3abe916fb9fe196b088c3e3ae0c467e"} Apr 23 16:44:15.417075 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:15.417049 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dd9xx" Apr 23 16:44:15.540348 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:15.540315 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/570f4226-98c9-49c5-b65a-0fdd687c3926-util\") pod \"570f4226-98c9-49c5-b65a-0fdd687c3926\" (UID: \"570f4226-98c9-49c5-b65a-0fdd687c3926\") " Apr 23 16:44:15.540525 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:15.540361 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/570f4226-98c9-49c5-b65a-0fdd687c3926-bundle\") pod \"570f4226-98c9-49c5-b65a-0fdd687c3926\" (UID: \"570f4226-98c9-49c5-b65a-0fdd687c3926\") " Apr 23 16:44:15.540525 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:15.540382 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgfhk\" (UniqueName: \"kubernetes.io/projected/570f4226-98c9-49c5-b65a-0fdd687c3926-kube-api-access-fgfhk\") pod \"570f4226-98c9-49c5-b65a-0fdd687c3926\" (UID: \"570f4226-98c9-49c5-b65a-0fdd687c3926\") " Apr 23 16:44:15.541179 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:15.541127 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/570f4226-98c9-49c5-b65a-0fdd687c3926-bundle" (OuterVolumeSpecName: "bundle") pod "570f4226-98c9-49c5-b65a-0fdd687c3926" (UID: "570f4226-98c9-49c5-b65a-0fdd687c3926"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:44:15.542510 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:15.542487 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/570f4226-98c9-49c5-b65a-0fdd687c3926-kube-api-access-fgfhk" (OuterVolumeSpecName: "kube-api-access-fgfhk") pod "570f4226-98c9-49c5-b65a-0fdd687c3926" (UID: "570f4226-98c9-49c5-b65a-0fdd687c3926"). InnerVolumeSpecName "kube-api-access-fgfhk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:44:15.545004 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:15.544983 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/570f4226-98c9-49c5-b65a-0fdd687c3926-util" (OuterVolumeSpecName: "util") pod "570f4226-98c9-49c5-b65a-0fdd687c3926" (UID: "570f4226-98c9-49c5-b65a-0fdd687c3926"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:44:15.641578 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:15.641544 2580 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/570f4226-98c9-49c5-b65a-0fdd687c3926-bundle\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:44:15.641578 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:15.641574 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fgfhk\" (UniqueName: \"kubernetes.io/projected/570f4226-98c9-49c5-b65a-0fdd687c3926-kube-api-access-fgfhk\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:44:15.641578 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:15.641587 2580 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/570f4226-98c9-49c5-b65a-0fdd687c3926-util\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:44:16.295647 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:16.295593 2580 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dd9xx" event={"ID":"570f4226-98c9-49c5-b65a-0fdd687c3926","Type":"ContainerDied","Data":"e1b531775617822d26f9faa8d3d10ca93b4535fed2f02a8a3816bfb7e874a661"} Apr 23 16:44:16.295647 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:16.295652 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1b531775617822d26f9faa8d3d10ca93b4535fed2f02a8a3816bfb7e874a661" Apr 23 16:44:16.295855 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:16.295616 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dd9xx" Apr 23 16:44:21.364419 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:21.364386 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb52qj8"] Apr 23 16:44:21.364799 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:21.364715 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="570f4226-98c9-49c5-b65a-0fdd687c3926" containerName="pull" Apr 23 16:44:21.364799 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:21.364725 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="570f4226-98c9-49c5-b65a-0fdd687c3926" containerName="pull" Apr 23 16:44:21.364799 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:21.364742 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="570f4226-98c9-49c5-b65a-0fdd687c3926" containerName="extract" Apr 23 16:44:21.364799 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:21.364747 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="570f4226-98c9-49c5-b65a-0fdd687c3926" containerName="extract" Apr 23 16:44:21.364799 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:21.364756 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="570f4226-98c9-49c5-b65a-0fdd687c3926" containerName="util" Apr 23 16:44:21.364799 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:21.364761 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="570f4226-98c9-49c5-b65a-0fdd687c3926" containerName="util" Apr 23 16:44:21.365026 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:21.364817 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="570f4226-98c9-49c5-b65a-0fdd687c3926" containerName="extract" Apr 23 16:44:21.369073 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:21.369056 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb52qj8" Apr 23 16:44:21.372081 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:21.372061 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 23 16:44:21.372161 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:21.372142 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 23 16:44:21.372228 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:21.372212 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5s2x5\"" Apr 23 16:44:21.378780 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:21.378748 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb52qj8"] Apr 23 16:44:21.491042 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:21.491002 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/93c0c903-b0d7-4b5d-b741-003f8bbbf1ca-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb52qj8\" (UID: 
\"93c0c903-b0d7-4b5d-b741-003f8bbbf1ca\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb52qj8" Apr 23 16:44:21.491225 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:21.491070 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phhvf\" (UniqueName: \"kubernetes.io/projected/93c0c903-b0d7-4b5d-b741-003f8bbbf1ca-kube-api-access-phhvf\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb52qj8\" (UID: \"93c0c903-b0d7-4b5d-b741-003f8bbbf1ca\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb52qj8" Apr 23 16:44:21.491225 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:21.491101 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/93c0c903-b0d7-4b5d-b741-003f8bbbf1ca-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb52qj8\" (UID: \"93c0c903-b0d7-4b5d-b741-003f8bbbf1ca\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb52qj8" Apr 23 16:44:21.592099 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:21.592058 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/93c0c903-b0d7-4b5d-b741-003f8bbbf1ca-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb52qj8\" (UID: \"93c0c903-b0d7-4b5d-b741-003f8bbbf1ca\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb52qj8" Apr 23 16:44:21.592328 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:21.592131 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/93c0c903-b0d7-4b5d-b741-003f8bbbf1ca-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb52qj8\" (UID: \"93c0c903-b0d7-4b5d-b741-003f8bbbf1ca\") " 
pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb52qj8" Apr 23 16:44:21.592328 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:21.592183 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-phhvf\" (UniqueName: \"kubernetes.io/projected/93c0c903-b0d7-4b5d-b741-003f8bbbf1ca-kube-api-access-phhvf\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb52qj8\" (UID: \"93c0c903-b0d7-4b5d-b741-003f8bbbf1ca\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb52qj8" Apr 23 16:44:21.592566 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:21.592537 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/93c0c903-b0d7-4b5d-b741-003f8bbbf1ca-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb52qj8\" (UID: \"93c0c903-b0d7-4b5d-b741-003f8bbbf1ca\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb52qj8" Apr 23 16:44:21.592620 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:21.592544 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/93c0c903-b0d7-4b5d-b741-003f8bbbf1ca-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb52qj8\" (UID: \"93c0c903-b0d7-4b5d-b741-003f8bbbf1ca\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb52qj8" Apr 23 16:44:21.601309 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:21.601257 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-phhvf\" (UniqueName: \"kubernetes.io/projected/93c0c903-b0d7-4b5d-b741-003f8bbbf1ca-kube-api-access-phhvf\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb52qj8\" (UID: \"93c0c903-b0d7-4b5d-b741-003f8bbbf1ca\") " 
pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb52qj8" Apr 23 16:44:21.678601 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:21.678511 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb52qj8" Apr 23 16:44:21.818628 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:21.818603 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb52qj8"] Apr 23 16:44:21.819110 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:44:21.819085 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93c0c903_b0d7_4b5d_b741_003f8bbbf1ca.slice/crio-aa4e6ca4d4fe78d7d8f736c4762385d33588239702f74cedfd25b1ab2f0c2b47 WatchSource:0}: Error finding container aa4e6ca4d4fe78d7d8f736c4762385d33588239702f74cedfd25b1ab2f0c2b47: Status 404 returned error can't find the container with id aa4e6ca4d4fe78d7d8f736c4762385d33588239702f74cedfd25b1ab2f0c2b47 Apr 23 16:44:22.317568 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:22.317533 2580 generic.go:358] "Generic (PLEG): container finished" podID="93c0c903-b0d7-4b5d-b741-003f8bbbf1ca" containerID="c01cbbbddd00537a0b63372255b6552f233f3c8b780f00ba4b3b03df30eb042b" exitCode=0 Apr 23 16:44:22.317772 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:22.317621 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb52qj8" event={"ID":"93c0c903-b0d7-4b5d-b741-003f8bbbf1ca","Type":"ContainerDied","Data":"c01cbbbddd00537a0b63372255b6552f233f3c8b780f00ba4b3b03df30eb042b"} Apr 23 16:44:22.317772 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:22.317658 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb52qj8" event={"ID":"93c0c903-b0d7-4b5d-b741-003f8bbbf1ca","Type":"ContainerStarted","Data":"aa4e6ca4d4fe78d7d8f736c4762385d33588239702f74cedfd25b1ab2f0c2b47"} Apr 23 16:44:22.602513 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:22.602445 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-w42l6"] Apr 23 16:44:22.605814 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:22.605616 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-w42l6" Apr 23 16:44:22.608360 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:22.608339 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 23 16:44:22.608476 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:22.608409 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 23 16:44:22.608476 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:22.608451 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-wtdtd\"" Apr 23 16:44:22.618346 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:22.618323 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-w42l6"] Apr 23 16:44:22.701061 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:22.701026 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/6ac56b2e-1396-4b77-b5b8-c46efbd0e606-operator-config\") pod \"servicemesh-operator3-55f49c5f94-w42l6\" (UID: \"6ac56b2e-1396-4b77-b5b8-c46efbd0e606\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-w42l6" Apr 23 
16:44:22.701061 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:22.701064 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jlvr\" (UniqueName: \"kubernetes.io/projected/6ac56b2e-1396-4b77-b5b8-c46efbd0e606-kube-api-access-7jlvr\") pod \"servicemesh-operator3-55f49c5f94-w42l6\" (UID: \"6ac56b2e-1396-4b77-b5b8-c46efbd0e606\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-w42l6" Apr 23 16:44:22.801783 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:22.801744 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/6ac56b2e-1396-4b77-b5b8-c46efbd0e606-operator-config\") pod \"servicemesh-operator3-55f49c5f94-w42l6\" (UID: \"6ac56b2e-1396-4b77-b5b8-c46efbd0e606\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-w42l6" Apr 23 16:44:22.801978 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:22.801789 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7jlvr\" (UniqueName: \"kubernetes.io/projected/6ac56b2e-1396-4b77-b5b8-c46efbd0e606-kube-api-access-7jlvr\") pod \"servicemesh-operator3-55f49c5f94-w42l6\" (UID: \"6ac56b2e-1396-4b77-b5b8-c46efbd0e606\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-w42l6" Apr 23 16:44:22.804262 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:22.804239 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/6ac56b2e-1396-4b77-b5b8-c46efbd0e606-operator-config\") pod \"servicemesh-operator3-55f49c5f94-w42l6\" (UID: \"6ac56b2e-1396-4b77-b5b8-c46efbd0e606\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-w42l6" Apr 23 16:44:22.815780 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:22.815753 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jlvr\" (UniqueName: 
\"kubernetes.io/projected/6ac56b2e-1396-4b77-b5b8-c46efbd0e606-kube-api-access-7jlvr\") pod \"servicemesh-operator3-55f49c5f94-w42l6\" (UID: \"6ac56b2e-1396-4b77-b5b8-c46efbd0e606\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-w42l6" Apr 23 16:44:22.916034 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:22.915948 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-w42l6" Apr 23 16:44:23.043928 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:23.043895 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-w42l6"] Apr 23 16:44:23.047174 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:44:23.047146 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ac56b2e_1396_4b77_b5b8_c46efbd0e606.slice/crio-55e98de318d129df91a967d2a26c2713ff58161a814e450595f5f208b87d902c WatchSource:0}: Error finding container 55e98de318d129df91a967d2a26c2713ff58161a814e450595f5f208b87d902c: Status 404 returned error can't find the container with id 55e98de318d129df91a967d2a26c2713ff58161a814e450595f5f208b87d902c Apr 23 16:44:23.322868 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:23.322804 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-w42l6" event={"ID":"6ac56b2e-1396-4b77-b5b8-c46efbd0e606","Type":"ContainerStarted","Data":"55e98de318d129df91a967d2a26c2713ff58161a814e450595f5f208b87d902c"} Apr 23 16:44:24.327901 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:24.327868 2580 generic.go:358] "Generic (PLEG): container finished" podID="93c0c903-b0d7-4b5d-b741-003f8bbbf1ca" containerID="daeec99b85731f6838e3e8636d6f978612e83f120a9efb2661fa02bfcde83b00" exitCode=0 Apr 23 16:44:24.328316 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:24.327915 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb52qj8" event={"ID":"93c0c903-b0d7-4b5d-b741-003f8bbbf1ca","Type":"ContainerDied","Data":"daeec99b85731f6838e3e8636d6f978612e83f120a9efb2661fa02bfcde83b00"} Apr 23 16:44:25.332964 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:25.332922 2580 generic.go:358] "Generic (PLEG): container finished" podID="93c0c903-b0d7-4b5d-b741-003f8bbbf1ca" containerID="32f455fe8db3fdf7a92cf73011a02b791dea3e47dd1b8a077a34e278761d3012" exitCode=0 Apr 23 16:44:25.333364 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:25.332971 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb52qj8" event={"ID":"93c0c903-b0d7-4b5d-b741-003f8bbbf1ca","Type":"ContainerDied","Data":"32f455fe8db3fdf7a92cf73011a02b791dea3e47dd1b8a077a34e278761d3012"} Apr 23 16:44:26.464188 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:26.464165 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb52qj8" Apr 23 16:44:26.534542 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:26.534505 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/93c0c903-b0d7-4b5d-b741-003f8bbbf1ca-bundle\") pod \"93c0c903-b0d7-4b5d-b741-003f8bbbf1ca\" (UID: \"93c0c903-b0d7-4b5d-b741-003f8bbbf1ca\") " Apr 23 16:44:26.534542 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:26.534544 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phhvf\" (UniqueName: \"kubernetes.io/projected/93c0c903-b0d7-4b5d-b741-003f8bbbf1ca-kube-api-access-phhvf\") pod \"93c0c903-b0d7-4b5d-b741-003f8bbbf1ca\" (UID: \"93c0c903-b0d7-4b5d-b741-003f8bbbf1ca\") " Apr 23 16:44:26.534792 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:26.534633 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/93c0c903-b0d7-4b5d-b741-003f8bbbf1ca-util\") pod \"93c0c903-b0d7-4b5d-b741-003f8bbbf1ca\" (UID: \"93c0c903-b0d7-4b5d-b741-003f8bbbf1ca\") " Apr 23 16:44:26.535575 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:26.535518 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93c0c903-b0d7-4b5d-b741-003f8bbbf1ca-bundle" (OuterVolumeSpecName: "bundle") pod "93c0c903-b0d7-4b5d-b741-003f8bbbf1ca" (UID: "93c0c903-b0d7-4b5d-b741-003f8bbbf1ca"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:44:26.536817 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:26.536791 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93c0c903-b0d7-4b5d-b741-003f8bbbf1ca-kube-api-access-phhvf" (OuterVolumeSpecName: "kube-api-access-phhvf") pod "93c0c903-b0d7-4b5d-b741-003f8bbbf1ca" (UID: "93c0c903-b0d7-4b5d-b741-003f8bbbf1ca"). InnerVolumeSpecName "kube-api-access-phhvf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:44:26.542861 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:26.542814 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93c0c903-b0d7-4b5d-b741-003f8bbbf1ca-util" (OuterVolumeSpecName: "util") pod "93c0c903-b0d7-4b5d-b741-003f8bbbf1ca" (UID: "93c0c903-b0d7-4b5d-b741-003f8bbbf1ca"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:44:26.636241 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:26.636150 2580 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/93c0c903-b0d7-4b5d-b741-003f8bbbf1ca-util\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:44:26.636241 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:26.636179 2580 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/93c0c903-b0d7-4b5d-b741-003f8bbbf1ca-bundle\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:44:26.636241 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:26.636192 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-phhvf\" (UniqueName: \"kubernetes.io/projected/93c0c903-b0d7-4b5d-b741-003f8bbbf1ca-kube-api-access-phhvf\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:44:27.341213 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:27.341180 2580 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb52qj8" Apr 23 16:44:27.341399 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:27.341180 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb52qj8" event={"ID":"93c0c903-b0d7-4b5d-b741-003f8bbbf1ca","Type":"ContainerDied","Data":"aa4e6ca4d4fe78d7d8f736c4762385d33588239702f74cedfd25b1ab2f0c2b47"} Apr 23 16:44:27.341399 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:27.341316 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa4e6ca4d4fe78d7d8f736c4762385d33588239702f74cedfd25b1ab2f0c2b47" Apr 23 16:44:42.347960 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.347922 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc"] Apr 23 16:44:42.348457 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.348408 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93c0c903-b0d7-4b5d-b741-003f8bbbf1ca" containerName="pull" Apr 23 16:44:42.348457 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.348428 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="93c0c903-b0d7-4b5d-b741-003f8bbbf1ca" containerName="pull" Apr 23 16:44:42.348457 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.348453 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93c0c903-b0d7-4b5d-b741-003f8bbbf1ca" containerName="util" Apr 23 16:44:42.348608 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.348463 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="93c0c903-b0d7-4b5d-b741-003f8bbbf1ca" containerName="util" Apr 23 16:44:42.348608 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.348481 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="93c0c903-b0d7-4b5d-b741-003f8bbbf1ca" containerName="extract" Apr 23 16:44:42.348608 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.348492 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="93c0c903-b0d7-4b5d-b741-003f8bbbf1ca" containerName="extract" Apr 23 16:44:42.348608 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.348576 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="93c0c903-b0d7-4b5d-b741-003f8bbbf1ca" containerName="extract" Apr 23 16:44:42.351793 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.351771 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" Apr 23 16:44:42.354464 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.354438 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 23 16:44:42.354563 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.354438 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 23 16:44:42.354722 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.354700 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 23 16:44:42.354722 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.354715 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 23 16:44:42.354882 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.354713 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-xc4l6\"" Apr 23 16:44:42.355170 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.355141 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 23 16:44:42.355502 
ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.355487 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 23 16:44:42.363019 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.362997 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc"] Apr 23 16:44:42.393669 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.393632 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-w42l6" event={"ID":"6ac56b2e-1396-4b77-b5b8-c46efbd0e606","Type":"ContainerStarted","Data":"0c2ba29203b4edca6e5e725a9e97079c1e2b1497b9850676b8dd4eea5b66c807"} Apr 23 16:44:42.393829 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.393754 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-w42l6" Apr 23 16:44:42.415365 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.415281 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-w42l6" podStartSLOduration=2.046790147 podStartE2EDuration="20.415263448s" podCreationTimestamp="2026-04-23 16:44:22 +0000 UTC" firstStartedPulling="2026-04-23 16:44:23.049705424 +0000 UTC m=+549.110779660" lastFinishedPulling="2026-04-23 16:44:41.41817873 +0000 UTC m=+567.479252961" observedRunningTime="2026-04-23 16:44:42.414566814 +0000 UTC m=+568.475641080" watchObservedRunningTime="2026-04-23 16:44:42.415263448 +0000 UTC m=+568.476337702" Apr 23 16:44:42.473760 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.473722 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/16ea3634-70d5-4dbb-b92a-56a222a8bebe-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-mppdc\" (UID: 
\"16ea3634-70d5-4dbb-b92a-56a222a8bebe\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" Apr 23 16:44:42.473760 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.473771 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwvzb\" (UniqueName: \"kubernetes.io/projected/16ea3634-70d5-4dbb-b92a-56a222a8bebe-kube-api-access-gwvzb\") pod \"istiod-openshift-gateway-7cd77c7ffd-mppdc\" (UID: \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" Apr 23 16:44:42.473982 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.473858 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/16ea3634-70d5-4dbb-b92a-56a222a8bebe-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-mppdc\" (UID: \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" Apr 23 16:44:42.473982 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.473900 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/16ea3634-70d5-4dbb-b92a-56a222a8bebe-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-mppdc\" (UID: \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" Apr 23 16:44:42.473982 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.473918 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/16ea3634-70d5-4dbb-b92a-56a222a8bebe-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-mppdc\" (UID: \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" Apr 23 
16:44:42.473982 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.473957 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/16ea3634-70d5-4dbb-b92a-56a222a8bebe-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-mppdc\" (UID: \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" Apr 23 16:44:42.474129 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.474000 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/16ea3634-70d5-4dbb-b92a-56a222a8bebe-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-mppdc\" (UID: \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" Apr 23 16:44:42.574586 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.574544 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwvzb\" (UniqueName: \"kubernetes.io/projected/16ea3634-70d5-4dbb-b92a-56a222a8bebe-kube-api-access-gwvzb\") pod \"istiod-openshift-gateway-7cd77c7ffd-mppdc\" (UID: \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" Apr 23 16:44:42.574775 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.574599 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/16ea3634-70d5-4dbb-b92a-56a222a8bebe-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-mppdc\" (UID: \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" Apr 23 16:44:42.574775 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.574636 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" 
(UniqueName: \"kubernetes.io/empty-dir/16ea3634-70d5-4dbb-b92a-56a222a8bebe-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-mppdc\" (UID: \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" Apr 23 16:44:42.574775 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.574665 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/16ea3634-70d5-4dbb-b92a-56a222a8bebe-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-mppdc\" (UID: \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" Apr 23 16:44:42.574775 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.574703 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/16ea3634-70d5-4dbb-b92a-56a222a8bebe-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-mppdc\" (UID: \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" Apr 23 16:44:42.574775 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.574740 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/16ea3634-70d5-4dbb-b92a-56a222a8bebe-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-mppdc\" (UID: \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" Apr 23 16:44:42.575038 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.574783 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/16ea3634-70d5-4dbb-b92a-56a222a8bebe-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-mppdc\" (UID: \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\") " 
pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" Apr 23 16:44:42.575434 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.575376 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/16ea3634-70d5-4dbb-b92a-56a222a8bebe-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-mppdc\" (UID: \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" Apr 23 16:44:42.577555 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.577532 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/16ea3634-70d5-4dbb-b92a-56a222a8bebe-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-mppdc\" (UID: \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" Apr 23 16:44:42.577706 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.577686 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/16ea3634-70d5-4dbb-b92a-56a222a8bebe-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-mppdc\" (UID: \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" Apr 23 16:44:42.577815 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.577797 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/16ea3634-70d5-4dbb-b92a-56a222a8bebe-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-mppdc\" (UID: \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" Apr 23 16:44:42.577881 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.577817 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/16ea3634-70d5-4dbb-b92a-56a222a8bebe-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-mppdc\" (UID: \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" Apr 23 16:44:42.583007 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.582986 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/16ea3634-70d5-4dbb-b92a-56a222a8bebe-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-mppdc\" (UID: \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" Apr 23 16:44:42.583179 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.583157 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwvzb\" (UniqueName: \"kubernetes.io/projected/16ea3634-70d5-4dbb-b92a-56a222a8bebe-kube-api-access-gwvzb\") pod \"istiod-openshift-gateway-7cd77c7ffd-mppdc\" (UID: \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" Apr 23 16:44:42.662731 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.662628 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" Apr 23 16:44:42.796594 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:42.796567 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc"] Apr 23 16:44:42.799401 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:44:42.799371 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16ea3634_70d5_4dbb_b92a_56a222a8bebe.slice/crio-bcb87fb54d68f80e11a0d7af9f0f8e37d456eedfb1b11e58b2a95cb418d8a434 WatchSource:0}: Error finding container bcb87fb54d68f80e11a0d7af9f0f8e37d456eedfb1b11e58b2a95cb418d8a434: Status 404 returned error can't find the container with id bcb87fb54d68f80e11a0d7af9f0f8e37d456eedfb1b11e58b2a95cb418d8a434 Apr 23 16:44:43.397938 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:43.397896 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" event={"ID":"16ea3634-70d5-4dbb-b92a-56a222a8bebe","Type":"ContainerStarted","Data":"bcb87fb54d68f80e11a0d7af9f0f8e37d456eedfb1b11e58b2a95cb418d8a434"} Apr 23 16:44:46.717687 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:46.717653 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-d4d9b9949-zcf8z"] Apr 23 16:44:46.721000 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:46.720980 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-d4d9b9949-zcf8z" Apr 23 16:44:46.730626 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:46.730602 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d4d9b9949-zcf8z"] Apr 23 16:44:46.909670 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:46.909616 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6e1a41ee-fc33-41dd-953f-d37e45dacbee-console-oauth-config\") pod \"console-d4d9b9949-zcf8z\" (UID: \"6e1a41ee-fc33-41dd-953f-d37e45dacbee\") " pod="openshift-console/console-d4d9b9949-zcf8z" Apr 23 16:44:46.909670 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:46.909669 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnjjk\" (UniqueName: \"kubernetes.io/projected/6e1a41ee-fc33-41dd-953f-d37e45dacbee-kube-api-access-cnjjk\") pod \"console-d4d9b9949-zcf8z\" (UID: \"6e1a41ee-fc33-41dd-953f-d37e45dacbee\") " pod="openshift-console/console-d4d9b9949-zcf8z" Apr 23 16:44:46.909890 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:46.909699 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e1a41ee-fc33-41dd-953f-d37e45dacbee-console-serving-cert\") pod \"console-d4d9b9949-zcf8z\" (UID: \"6e1a41ee-fc33-41dd-953f-d37e45dacbee\") " pod="openshift-console/console-d4d9b9949-zcf8z" Apr 23 16:44:46.909890 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:46.909743 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6e1a41ee-fc33-41dd-953f-d37e45dacbee-oauth-serving-cert\") pod \"console-d4d9b9949-zcf8z\" (UID: \"6e1a41ee-fc33-41dd-953f-d37e45dacbee\") " pod="openshift-console/console-d4d9b9949-zcf8z" 
Apr 23 16:44:46.909890 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:46.909766 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e1a41ee-fc33-41dd-953f-d37e45dacbee-trusted-ca-bundle\") pod \"console-d4d9b9949-zcf8z\" (UID: \"6e1a41ee-fc33-41dd-953f-d37e45dacbee\") " pod="openshift-console/console-d4d9b9949-zcf8z" Apr 23 16:44:46.909890 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:46.909788 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6e1a41ee-fc33-41dd-953f-d37e45dacbee-service-ca\") pod \"console-d4d9b9949-zcf8z\" (UID: \"6e1a41ee-fc33-41dd-953f-d37e45dacbee\") " pod="openshift-console/console-d4d9b9949-zcf8z" Apr 23 16:44:46.909890 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:46.909811 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6e1a41ee-fc33-41dd-953f-d37e45dacbee-console-config\") pod \"console-d4d9b9949-zcf8z\" (UID: \"6e1a41ee-fc33-41dd-953f-d37e45dacbee\") " pod="openshift-console/console-d4d9b9949-zcf8z" Apr 23 16:44:47.011157 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:47.011061 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6e1a41ee-fc33-41dd-953f-d37e45dacbee-service-ca\") pod \"console-d4d9b9949-zcf8z\" (UID: \"6e1a41ee-fc33-41dd-953f-d37e45dacbee\") " pod="openshift-console/console-d4d9b9949-zcf8z" Apr 23 16:44:47.011157 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:47.011129 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6e1a41ee-fc33-41dd-953f-d37e45dacbee-console-config\") pod \"console-d4d9b9949-zcf8z\" (UID: 
\"6e1a41ee-fc33-41dd-953f-d37e45dacbee\") " pod="openshift-console/console-d4d9b9949-zcf8z" Apr 23 16:44:47.011444 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:47.011186 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6e1a41ee-fc33-41dd-953f-d37e45dacbee-console-oauth-config\") pod \"console-d4d9b9949-zcf8z\" (UID: \"6e1a41ee-fc33-41dd-953f-d37e45dacbee\") " pod="openshift-console/console-d4d9b9949-zcf8z" Apr 23 16:44:47.011444 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:47.011221 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cnjjk\" (UniqueName: \"kubernetes.io/projected/6e1a41ee-fc33-41dd-953f-d37e45dacbee-kube-api-access-cnjjk\") pod \"console-d4d9b9949-zcf8z\" (UID: \"6e1a41ee-fc33-41dd-953f-d37e45dacbee\") " pod="openshift-console/console-d4d9b9949-zcf8z" Apr 23 16:44:47.011444 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:47.011267 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e1a41ee-fc33-41dd-953f-d37e45dacbee-console-serving-cert\") pod \"console-d4d9b9949-zcf8z\" (UID: \"6e1a41ee-fc33-41dd-953f-d37e45dacbee\") " pod="openshift-console/console-d4d9b9949-zcf8z" Apr 23 16:44:47.011444 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:47.011330 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6e1a41ee-fc33-41dd-953f-d37e45dacbee-oauth-serving-cert\") pod \"console-d4d9b9949-zcf8z\" (UID: \"6e1a41ee-fc33-41dd-953f-d37e45dacbee\") " pod="openshift-console/console-d4d9b9949-zcf8z" Apr 23 16:44:47.011444 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:47.011369 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6e1a41ee-fc33-41dd-953f-d37e45dacbee-trusted-ca-bundle\") pod \"console-d4d9b9949-zcf8z\" (UID: \"6e1a41ee-fc33-41dd-953f-d37e45dacbee\") " pod="openshift-console/console-d4d9b9949-zcf8z" Apr 23 16:44:47.011921 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:47.011894 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6e1a41ee-fc33-41dd-953f-d37e45dacbee-service-ca\") pod \"console-d4d9b9949-zcf8z\" (UID: \"6e1a41ee-fc33-41dd-953f-d37e45dacbee\") " pod="openshift-console/console-d4d9b9949-zcf8z" Apr 23 16:44:47.012012 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:47.011895 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6e1a41ee-fc33-41dd-953f-d37e45dacbee-console-config\") pod \"console-d4d9b9949-zcf8z\" (UID: \"6e1a41ee-fc33-41dd-953f-d37e45dacbee\") " pod="openshift-console/console-d4d9b9949-zcf8z" Apr 23 16:44:47.012142 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:47.012118 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6e1a41ee-fc33-41dd-953f-d37e45dacbee-oauth-serving-cert\") pod \"console-d4d9b9949-zcf8z\" (UID: \"6e1a41ee-fc33-41dd-953f-d37e45dacbee\") " pod="openshift-console/console-d4d9b9949-zcf8z" Apr 23 16:44:47.012202 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:47.012185 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e1a41ee-fc33-41dd-953f-d37e45dacbee-trusted-ca-bundle\") pod \"console-d4d9b9949-zcf8z\" (UID: \"6e1a41ee-fc33-41dd-953f-d37e45dacbee\") " pod="openshift-console/console-d4d9b9949-zcf8z" Apr 23 16:44:47.014057 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:47.014029 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/6e1a41ee-fc33-41dd-953f-d37e45dacbee-console-oauth-config\") pod \"console-d4d9b9949-zcf8z\" (UID: \"6e1a41ee-fc33-41dd-953f-d37e45dacbee\") " pod="openshift-console/console-d4d9b9949-zcf8z" Apr 23 16:44:47.014175 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:47.014117 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e1a41ee-fc33-41dd-953f-d37e45dacbee-console-serving-cert\") pod \"console-d4d9b9949-zcf8z\" (UID: \"6e1a41ee-fc33-41dd-953f-d37e45dacbee\") " pod="openshift-console/console-d4d9b9949-zcf8z" Apr 23 16:44:47.020802 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:47.020777 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnjjk\" (UniqueName: \"kubernetes.io/projected/6e1a41ee-fc33-41dd-953f-d37e45dacbee-kube-api-access-cnjjk\") pod \"console-d4d9b9949-zcf8z\" (UID: \"6e1a41ee-fc33-41dd-953f-d37e45dacbee\") " pod="openshift-console/console-d4d9b9949-zcf8z" Apr 23 16:44:47.030438 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:47.030418 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-d4d9b9949-zcf8z" Apr 23 16:44:47.158652 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:47.158621 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d4d9b9949-zcf8z"] Apr 23 16:44:47.161754 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:44:47.161718 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e1a41ee_fc33_41dd_953f_d37e45dacbee.slice/crio-56376b5509cf305466772e4d5e7f83d4637807682bc3860b08eb808e2349849f WatchSource:0}: Error finding container 56376b5509cf305466772e4d5e7f83d4637807682bc3860b08eb808e2349849f: Status 404 returned error can't find the container with id 56376b5509cf305466772e4d5e7f83d4637807682bc3860b08eb808e2349849f Apr 23 16:44:47.413492 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:47.413406 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d4d9b9949-zcf8z" event={"ID":"6e1a41ee-fc33-41dd-953f-d37e45dacbee","Type":"ContainerStarted","Data":"e06a92e559c006d0d982eb06fe554bc34d830c717135dfb244896cb11d035100"} Apr 23 16:44:47.413492 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:47.413445 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d4d9b9949-zcf8z" event={"ID":"6e1a41ee-fc33-41dd-953f-d37e45dacbee","Type":"ContainerStarted","Data":"56376b5509cf305466772e4d5e7f83d4637807682bc3860b08eb808e2349849f"} Apr 23 16:44:47.433117 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:47.433068 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-d4d9b9949-zcf8z" podStartSLOduration=1.433053862 podStartE2EDuration="1.433053862s" podCreationTimestamp="2026-04-23 16:44:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:44:47.431470386 +0000 UTC m=+573.492544642" 
watchObservedRunningTime="2026-04-23 16:44:47.433053862 +0000 UTC m=+573.494128116" Apr 23 16:44:53.390340 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:53.390282 2580 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 23 16:44:53.390594 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:53.390386 2580 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 23 16:44:53.400013 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:53.399992 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-w42l6" Apr 23 16:44:54.444925 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:54.443573 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" event={"ID":"16ea3634-70d5-4dbb-b92a-56a222a8bebe","Type":"ContainerStarted","Data":"63d51fdfa0100c66432086320efe078843a9f7a76f15ba817b62d45a45099aeb"} Apr 23 16:44:54.444925 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:54.444864 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" Apr 23 16:44:54.446240 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:54.446213 2580 patch_prober.go:28] interesting pod/istiod-openshift-gateway-7cd77c7ffd-mppdc container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 23 16:44:54.446387 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:54.446262 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" podUID="16ea3634-70d5-4dbb-b92a-56a222a8bebe" containerName="discovery" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:44:54.476320 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:54.473130 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" podStartSLOduration=1.884639155 podStartE2EDuration="12.473112018s" podCreationTimestamp="2026-04-23 16:44:42 +0000 UTC" firstStartedPulling="2026-04-23 16:44:42.801525915 +0000 UTC m=+568.862600161" lastFinishedPulling="2026-04-23 16:44:53.38999879 +0000 UTC m=+579.451073024" observedRunningTime="2026-04-23 16:44:54.469361831 +0000 UTC m=+580.530436085" watchObservedRunningTime="2026-04-23 16:44:54.473112018 +0000 UTC m=+580.534186272" Apr 23 16:44:55.447795 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:55.447736 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" Apr 23 16:44:57.031396 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:57.031363 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-d4d9b9949-zcf8z" Apr 23 16:44:57.031815 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:57.031603 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-d4d9b9949-zcf8z" Apr 23 16:44:57.038210 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:57.038188 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-d4d9b9949-zcf8z" Apr 23 16:44:57.457894 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:57.457809 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-d4d9b9949-zcf8z" Apr 23 16:44:57.507945 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:44:57.507911 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66bd4b4847-f82xw"] Apr 23 16:45:03.577641 
ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.577601 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88j9wkl"] Apr 23 16:45:03.580998 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.580979 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88j9wkl" Apr 23 16:45:03.583777 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.583757 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 23 16:45:03.583891 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.583754 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 23 16:45:03.584836 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.584818 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5s2x5\"" Apr 23 16:45:03.588998 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.588977 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88j9wkl"] Apr 23 16:45:03.656354 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.656320 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69775\" (UniqueName: \"kubernetes.io/projected/d8dae3ca-89b7-4da1-b2ce-754f4271850d-kube-api-access-69775\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88j9wkl\" (UID: \"d8dae3ca-89b7-4da1-b2ce-754f4271850d\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88j9wkl" Apr 23 16:45:03.656524 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.656428 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d8dae3ca-89b7-4da1-b2ce-754f4271850d-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88j9wkl\" (UID: \"d8dae3ca-89b7-4da1-b2ce-754f4271850d\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88j9wkl" Apr 23 16:45:03.656524 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.656477 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d8dae3ca-89b7-4da1-b2ce-754f4271850d-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88j9wkl\" (UID: \"d8dae3ca-89b7-4da1-b2ce-754f4271850d\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88j9wkl" Apr 23 16:45:03.673381 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.673349 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30rpz28"] Apr 23 16:45:03.676785 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.676768 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30rpz28" Apr 23 16:45:03.682993 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.682971 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30rpz28"] Apr 23 16:45:03.757785 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.757754 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2ff500bc-4527-4d6d-9733-cb661b777ae1-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30rpz28\" (UID: \"2ff500bc-4527-4d6d-9733-cb661b777ae1\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30rpz28" Apr 23 16:45:03.757969 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.757801 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-69775\" (UniqueName: \"kubernetes.io/projected/d8dae3ca-89b7-4da1-b2ce-754f4271850d-kube-api-access-69775\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88j9wkl\" (UID: \"d8dae3ca-89b7-4da1-b2ce-754f4271850d\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88j9wkl" Apr 23 16:45:03.757969 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.757866 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znb7x\" (UniqueName: \"kubernetes.io/projected/2ff500bc-4527-4d6d-9733-cb661b777ae1-kube-api-access-znb7x\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30rpz28\" (UID: \"2ff500bc-4527-4d6d-9733-cb661b777ae1\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30rpz28" Apr 23 16:45:03.757969 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.757931 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d8dae3ca-89b7-4da1-b2ce-754f4271850d-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88j9wkl\" (UID: \"d8dae3ca-89b7-4da1-b2ce-754f4271850d\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88j9wkl" Apr 23 16:45:03.757969 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.757954 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2ff500bc-4527-4d6d-9733-cb661b777ae1-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30rpz28\" (UID: \"2ff500bc-4527-4d6d-9733-cb661b777ae1\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30rpz28" Apr 23 16:45:03.758120 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.758020 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d8dae3ca-89b7-4da1-b2ce-754f4271850d-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88j9wkl\" (UID: \"d8dae3ca-89b7-4da1-b2ce-754f4271850d\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88j9wkl" Apr 23 16:45:03.758395 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.758377 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d8dae3ca-89b7-4da1-b2ce-754f4271850d-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88j9wkl\" (UID: \"d8dae3ca-89b7-4da1-b2ce-754f4271850d\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88j9wkl" Apr 23 16:45:03.758439 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.758389 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/d8dae3ca-89b7-4da1-b2ce-754f4271850d-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88j9wkl\" (UID: \"d8dae3ca-89b7-4da1-b2ce-754f4271850d\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88j9wkl" Apr 23 16:45:03.767227 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.767202 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-69775\" (UniqueName: \"kubernetes.io/projected/d8dae3ca-89b7-4da1-b2ce-754f4271850d-kube-api-access-69775\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88j9wkl\" (UID: \"d8dae3ca-89b7-4da1-b2ce-754f4271850d\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88j9wkl" Apr 23 16:45:03.776132 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.776109 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bljvhw"] Apr 23 16:45:03.779631 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.779616 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bljvhw"
Apr 23 16:45:03.785791 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.785771 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bljvhw"]
Apr 23 16:45:03.858700 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.858602 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2ff500bc-4527-4d6d-9733-cb661b777ae1-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30rpz28\" (UID: \"2ff500bc-4527-4d6d-9733-cb661b777ae1\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30rpz28"
Apr 23 16:45:03.858700 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.858693 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2ff500bc-4527-4d6d-9733-cb661b777ae1-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30rpz28\" (UID: \"2ff500bc-4527-4d6d-9733-cb661b777ae1\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30rpz28"
Apr 23 16:45:03.858899 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.858754 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-znb7x\" (UniqueName: \"kubernetes.io/projected/2ff500bc-4527-4d6d-9733-cb661b777ae1-kube-api-access-znb7x\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30rpz28\" (UID: \"2ff500bc-4527-4d6d-9733-cb661b777ae1\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30rpz28"
Apr 23 16:45:03.858899 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.858787 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/56911694-2580-4427-953f-99f4d5771862-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bljvhw\" (UID: \"56911694-2580-4427-953f-99f4d5771862\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bljvhw"
Apr 23 16:45:03.858899 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.858814 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd4cn\" (UniqueName: \"kubernetes.io/projected/56911694-2580-4427-953f-99f4d5771862-kube-api-access-cd4cn\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bljvhw\" (UID: \"56911694-2580-4427-953f-99f4d5771862\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bljvhw"
Apr 23 16:45:03.859022 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.858904 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/56911694-2580-4427-953f-99f4d5771862-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bljvhw\" (UID: \"56911694-2580-4427-953f-99f4d5771862\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bljvhw"
Apr 23 16:45:03.859056 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.859032 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2ff500bc-4527-4d6d-9733-cb661b777ae1-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30rpz28\" (UID: \"2ff500bc-4527-4d6d-9733-cb661b777ae1\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30rpz28"
Apr 23 16:45:03.859095 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.859066 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2ff500bc-4527-4d6d-9733-cb661b777ae1-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30rpz28\" (UID: \"2ff500bc-4527-4d6d-9733-cb661b777ae1\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30rpz28"
Apr 23 16:45:03.867598 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.867571 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-znb7x\" (UniqueName: \"kubernetes.io/projected/2ff500bc-4527-4d6d-9733-cb661b777ae1-kube-api-access-znb7x\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30rpz28\" (UID: \"2ff500bc-4527-4d6d-9733-cb661b777ae1\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30rpz28"
Apr 23 16:45:03.872548 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.872526 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50357fch"]
Apr 23 16:45:03.876126 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.876108 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50357fch"
Apr 23 16:45:03.883168 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.883146 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50357fch"]
Apr 23 16:45:03.890867 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.890845 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88j9wkl"
Apr 23 16:45:03.959789 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.959750 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/56911694-2580-4427-953f-99f4d5771862-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bljvhw\" (UID: \"56911694-2580-4427-953f-99f4d5771862\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bljvhw"
Apr 23 16:45:03.959956 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.959896 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b939e50-5761-4424-a598-047b9627a397-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50357fch\" (UID: \"4b939e50-5761-4424-a598-047b9627a397\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50357fch"
Apr 23 16:45:03.959956 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.959941 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5dkd\" (UniqueName: \"kubernetes.io/projected/4b939e50-5761-4424-a598-047b9627a397-kube-api-access-h5dkd\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50357fch\" (UID: \"4b939e50-5761-4424-a598-047b9627a397\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50357fch"
Apr 23 16:45:03.960106 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.959987 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/56911694-2580-4427-953f-99f4d5771862-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bljvhw\" (UID: \"56911694-2580-4427-953f-99f4d5771862\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bljvhw"
Apr 23 16:45:03.960106 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.960017 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cd4cn\" (UniqueName: \"kubernetes.io/projected/56911694-2580-4427-953f-99f4d5771862-kube-api-access-cd4cn\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bljvhw\" (UID: \"56911694-2580-4427-953f-99f4d5771862\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bljvhw"
Apr 23 16:45:03.960106 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.960059 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b939e50-5761-4424-a598-047b9627a397-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50357fch\" (UID: \"4b939e50-5761-4424-a598-047b9627a397\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50357fch"
Apr 23 16:45:03.960262 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.960196 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/56911694-2580-4427-953f-99f4d5771862-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bljvhw\" (UID: \"56911694-2580-4427-953f-99f4d5771862\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bljvhw"
Apr 23 16:45:03.960262 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.960242 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/56911694-2580-4427-953f-99f4d5771862-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bljvhw\" (UID: \"56911694-2580-4427-953f-99f4d5771862\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bljvhw"
Apr 23 16:45:03.971741 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.969671 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd4cn\" (UniqueName: \"kubernetes.io/projected/56911694-2580-4427-953f-99f4d5771862-kube-api-access-cd4cn\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bljvhw\" (UID: \"56911694-2580-4427-953f-99f4d5771862\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bljvhw"
Apr 23 16:45:03.987459 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:03.987424 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30rpz28"
Apr 23 16:45:04.017881 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:04.017841 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88j9wkl"]
Apr 23 16:45:04.018974 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:45:04.018944 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8dae3ca_89b7_4da1_b2ce_754f4271850d.slice/crio-53a7d519fccfc339a37becc57b217e69cbaea4d1723d9e0ccfd7c014427f7ad6 WatchSource:0}: Error finding container 53a7d519fccfc339a37becc57b217e69cbaea4d1723d9e0ccfd7c014427f7ad6: Status 404 returned error can't find the container with id 53a7d519fccfc339a37becc57b217e69cbaea4d1723d9e0ccfd7c014427f7ad6
Apr 23 16:45:04.060707 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:04.060675 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b939e50-5761-4424-a598-047b9627a397-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50357fch\" (UID: \"4b939e50-5761-4424-a598-047b9627a397\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50357fch"
Apr 23 16:45:04.060837 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:04.060798 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b939e50-5761-4424-a598-047b9627a397-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50357fch\" (UID: \"4b939e50-5761-4424-a598-047b9627a397\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50357fch"
Apr 23 16:45:04.060886 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:04.060834 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5dkd\" (UniqueName: \"kubernetes.io/projected/4b939e50-5761-4424-a598-047b9627a397-kube-api-access-h5dkd\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50357fch\" (UID: \"4b939e50-5761-4424-a598-047b9627a397\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50357fch"
Apr 23 16:45:04.061077 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:04.061057 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b939e50-5761-4424-a598-047b9627a397-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50357fch\" (UID: \"4b939e50-5761-4424-a598-047b9627a397\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50357fch"
Apr 23 16:45:04.061147 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:04.061128 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b939e50-5761-4424-a598-047b9627a397-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50357fch\" (UID: \"4b939e50-5761-4424-a598-047b9627a397\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50357fch"
Apr 23 16:45:04.071070 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:04.070975 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5dkd\" (UniqueName: \"kubernetes.io/projected/4b939e50-5761-4424-a598-047b9627a397-kube-api-access-h5dkd\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50357fch\" (UID: \"4b939e50-5761-4424-a598-047b9627a397\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50357fch"
Apr 23 16:45:04.097067 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:04.097002 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bljvhw"
Apr 23 16:45:04.119343 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:04.119261 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30rpz28"]
Apr 23 16:45:04.121777 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:45:04.121740 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ff500bc_4527_4d6d_9733_cb661b777ae1.slice/crio-1303d1f2656683824d3ebcfe4df9a287a83ea94767e7b0188dcc161ab5c18a35 WatchSource:0}: Error finding container 1303d1f2656683824d3ebcfe4df9a287a83ea94767e7b0188dcc161ab5c18a35: Status 404 returned error can't find the container with id 1303d1f2656683824d3ebcfe4df9a287a83ea94767e7b0188dcc161ab5c18a35
Apr 23 16:45:04.186847 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:04.186809 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50357fch"
Apr 23 16:45:04.233190 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:04.233161 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bljvhw"]
Apr 23 16:45:04.234871 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:45:04.234840 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56911694_2580_4427_953f_99f4d5771862.slice/crio-4d6b2d7958e4da594e805986ed9fb8c0174634048c124fa745b78ffda7c6cce3 WatchSource:0}: Error finding container 4d6b2d7958e4da594e805986ed9fb8c0174634048c124fa745b78ffda7c6cce3: Status 404 returned error can't find the container with id 4d6b2d7958e4da594e805986ed9fb8c0174634048c124fa745b78ffda7c6cce3
Apr 23 16:45:04.319925 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:04.319899 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50357fch"]
Apr 23 16:45:04.332482 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:45:04.332450 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b939e50_5761_4424_a598_047b9627a397.slice/crio-f9ee8104dac45f45a196250da4cf882f7b66ef1d6618b8e2126d0537c4a51663 WatchSource:0}: Error finding container f9ee8104dac45f45a196250da4cf882f7b66ef1d6618b8e2126d0537c4a51663: Status 404 returned error can't find the container with id f9ee8104dac45f45a196250da4cf882f7b66ef1d6618b8e2126d0537c4a51663
Apr 23 16:45:04.478135 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:04.478102 2580 generic.go:358] "Generic (PLEG): container finished" podID="2ff500bc-4527-4d6d-9733-cb661b777ae1" containerID="b2628016aaff81c559066806698adc4550d5b329a3fe196fe40ce201ffb390d2" exitCode=0
Apr 23 16:45:04.478327 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:04.478180 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30rpz28" event={"ID":"2ff500bc-4527-4d6d-9733-cb661b777ae1","Type":"ContainerDied","Data":"b2628016aaff81c559066806698adc4550d5b329a3fe196fe40ce201ffb390d2"}
Apr 23 16:45:04.478327 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:04.478219 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30rpz28" event={"ID":"2ff500bc-4527-4d6d-9733-cb661b777ae1","Type":"ContainerStarted","Data":"1303d1f2656683824d3ebcfe4df9a287a83ea94767e7b0188dcc161ab5c18a35"}
Apr 23 16:45:04.479640 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:04.479625 2580 generic.go:358] "Generic (PLEG): container finished" podID="d8dae3ca-89b7-4da1-b2ce-754f4271850d" containerID="55b9a399b11042e10892e4870bca1ee6b9ce04a510cdda0364718977cabfa250" exitCode=0
Apr 23 16:45:04.479724 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:04.479707 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88j9wkl" event={"ID":"d8dae3ca-89b7-4da1-b2ce-754f4271850d","Type":"ContainerDied","Data":"55b9a399b11042e10892e4870bca1ee6b9ce04a510cdda0364718977cabfa250"}
Apr 23 16:45:04.479771 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:04.479734 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88j9wkl" event={"ID":"d8dae3ca-89b7-4da1-b2ce-754f4271850d","Type":"ContainerStarted","Data":"53a7d519fccfc339a37becc57b217e69cbaea4d1723d9e0ccfd7c014427f7ad6"}
Apr 23 16:45:04.481136 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:04.481115 2580 generic.go:358] "Generic (PLEG): container finished" podID="4b939e50-5761-4424-a598-047b9627a397" containerID="7e3506bd4615ec710d45247e2ca33ecdd78e941c2498cac4b3bba24f06604488" exitCode=0
Apr 23 16:45:04.481242 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:04.481171 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50357fch" event={"ID":"4b939e50-5761-4424-a598-047b9627a397","Type":"ContainerDied","Data":"7e3506bd4615ec710d45247e2ca33ecdd78e941c2498cac4b3bba24f06604488"}
Apr 23 16:45:04.481242 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:04.481190 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50357fch" event={"ID":"4b939e50-5761-4424-a598-047b9627a397","Type":"ContainerStarted","Data":"f9ee8104dac45f45a196250da4cf882f7b66ef1d6618b8e2126d0537c4a51663"}
Apr 23 16:45:04.482832 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:04.482814 2580 generic.go:358] "Generic (PLEG): container finished" podID="56911694-2580-4427-953f-99f4d5771862" containerID="6bc9c81aac81c76a2b0fc3076a7cc6755657f02e84659c7cc4daf5c0e052f4c3" exitCode=0
Apr 23 16:45:04.482922 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:04.482884 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bljvhw" event={"ID":"56911694-2580-4427-953f-99f4d5771862","Type":"ContainerDied","Data":"6bc9c81aac81c76a2b0fc3076a7cc6755657f02e84659c7cc4daf5c0e052f4c3"}
Apr 23 16:45:04.482922 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:04.482913 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bljvhw" event={"ID":"56911694-2580-4427-953f-99f4d5771862","Type":"ContainerStarted","Data":"4d6b2d7958e4da594e805986ed9fb8c0174634048c124fa745b78ffda7c6cce3"}
Apr 23 16:45:06.497613 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:06.497578 2580 generic.go:358] "Generic (PLEG): container finished" podID="2ff500bc-4527-4d6d-9733-cb661b777ae1" containerID="d91780a6a3779919df92cc97350256222d1300a6600eece2d1577768b551f630" exitCode=0
Apr 23 16:45:06.498094 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:06.497658 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30rpz28" event={"ID":"2ff500bc-4527-4d6d-9733-cb661b777ae1","Type":"ContainerDied","Data":"d91780a6a3779919df92cc97350256222d1300a6600eece2d1577768b551f630"}
Apr 23 16:45:06.499658 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:06.499636 2580 generic.go:358] "Generic (PLEG): container finished" podID="d8dae3ca-89b7-4da1-b2ce-754f4271850d" containerID="3e81822ac654bc7dd07f13da5f4ae41c61b2c91de91d206ff2ccdc63c9bcd10e" exitCode=0
Apr 23 16:45:06.499742 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:06.499718 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88j9wkl" event={"ID":"d8dae3ca-89b7-4da1-b2ce-754f4271850d","Type":"ContainerDied","Data":"3e81822ac654bc7dd07f13da5f4ae41c61b2c91de91d206ff2ccdc63c9bcd10e"}
Apr 23 16:45:06.501552 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:06.501530 2580 generic.go:358] "Generic (PLEG): container finished" podID="4b939e50-5761-4424-a598-047b9627a397" containerID="f2240c2a0e012a5afe289b5142929695b277f8ca8c5a2cca2f670fbda7e76183" exitCode=0
Apr 23 16:45:06.501652 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:06.501554 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50357fch" event={"ID":"4b939e50-5761-4424-a598-047b9627a397","Type":"ContainerDied","Data":"f2240c2a0e012a5afe289b5142929695b277f8ca8c5a2cca2f670fbda7e76183"}
Apr 23 16:45:06.503374 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:06.503261 2580 generic.go:358] "Generic (PLEG): container finished" podID="56911694-2580-4427-953f-99f4d5771862" containerID="389ae23243331e90af6b9711fb8ae1c4727f6e4e524a09572d7d1718af91d9c5" exitCode=0
Apr 23 16:45:06.503374 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:06.503319 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bljvhw" event={"ID":"56911694-2580-4427-953f-99f4d5771862","Type":"ContainerDied","Data":"389ae23243331e90af6b9711fb8ae1c4727f6e4e524a09572d7d1718af91d9c5"}
Apr 23 16:45:07.509442 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:07.509409 2580 generic.go:358] "Generic (PLEG): container finished" podID="2ff500bc-4527-4d6d-9733-cb661b777ae1" containerID="be448819b2aff9acf2564fa37b9e7b26343d89855e2a4951998c8f44b3244aca" exitCode=0
Apr 23 16:45:07.509882 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:07.509496 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30rpz28" event={"ID":"2ff500bc-4527-4d6d-9733-cb661b777ae1","Type":"ContainerDied","Data":"be448819b2aff9acf2564fa37b9e7b26343d89855e2a4951998c8f44b3244aca"}
Apr 23 16:45:07.511403 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:07.511378 2580 generic.go:358] "Generic (PLEG): container finished" podID="d8dae3ca-89b7-4da1-b2ce-754f4271850d" containerID="9f8676f2402d0eb3d14c22622d5e09a98f23324525d70c37446b46f5e5ff4f2b" exitCode=0
Apr 23 16:45:07.511518 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:07.511458 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88j9wkl" event={"ID":"d8dae3ca-89b7-4da1-b2ce-754f4271850d","Type":"ContainerDied","Data":"9f8676f2402d0eb3d14c22622d5e09a98f23324525d70c37446b46f5e5ff4f2b"}
Apr 23 16:45:07.513171 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:07.513148 2580 generic.go:358] "Generic (PLEG): container finished" podID="4b939e50-5761-4424-a598-047b9627a397" containerID="c996d7e31f0a7d0c97366977599604d9d9e7358ccb485126c984e95c8a88d83c" exitCode=0
Apr 23 16:45:07.513267 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:07.513176 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50357fch" event={"ID":"4b939e50-5761-4424-a598-047b9627a397","Type":"ContainerDied","Data":"c996d7e31f0a7d0c97366977599604d9d9e7358ccb485126c984e95c8a88d83c"}
Apr 23 16:45:07.514945 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:07.514926 2580 generic.go:358] "Generic (PLEG): container finished" podID="56911694-2580-4427-953f-99f4d5771862" containerID="7d930fa4c984b4472197066ff70a92317488d88a07427fde85cc6834d95e5cb4" exitCode=0
Apr 23 16:45:07.515038 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:07.514975 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bljvhw" event={"ID":"56911694-2580-4427-953f-99f4d5771862","Type":"ContainerDied","Data":"7d930fa4c984b4472197066ff70a92317488d88a07427fde85cc6834d95e5cb4"}
Apr 23 16:45:08.665029 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.665004 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30rpz28"
Apr 23 16:45:08.709067 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.709042 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bljvhw"
Apr 23 16:45:08.737518 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.737488 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88j9wkl"
Apr 23 16:45:08.740757 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.740734 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50357fch"
Apr 23 16:45:08.805152 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.805081 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69775\" (UniqueName: \"kubernetes.io/projected/d8dae3ca-89b7-4da1-b2ce-754f4271850d-kube-api-access-69775\") pod \"d8dae3ca-89b7-4da1-b2ce-754f4271850d\" (UID: \"d8dae3ca-89b7-4da1-b2ce-754f4271850d\") "
Apr 23 16:45:08.805152 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.805123 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5dkd\" (UniqueName: \"kubernetes.io/projected/4b939e50-5761-4424-a598-047b9627a397-kube-api-access-h5dkd\") pod \"4b939e50-5761-4424-a598-047b9627a397\" (UID: \"4b939e50-5761-4424-a598-047b9627a397\") "
Apr 23 16:45:08.805152 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.805148 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/56911694-2580-4427-953f-99f4d5771862-bundle\") pod \"56911694-2580-4427-953f-99f4d5771862\" (UID: \"56911694-2580-4427-953f-99f4d5771862\") "
Apr 23 16:45:08.805446 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.805175 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2ff500bc-4527-4d6d-9733-cb661b777ae1-bundle\") pod \"2ff500bc-4527-4d6d-9733-cb661b777ae1\" (UID: \"2ff500bc-4527-4d6d-9733-cb661b777ae1\") "
Apr 23 16:45:08.805446 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.805203 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d8dae3ca-89b7-4da1-b2ce-754f4271850d-bundle\") pod \"d8dae3ca-89b7-4da1-b2ce-754f4271850d\" (UID: \"d8dae3ca-89b7-4da1-b2ce-754f4271850d\") "
Apr 23 16:45:08.805446 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.805221 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2ff500bc-4527-4d6d-9733-cb661b777ae1-util\") pod \"2ff500bc-4527-4d6d-9733-cb661b777ae1\" (UID: \"2ff500bc-4527-4d6d-9733-cb661b777ae1\") "
Apr 23 16:45:08.805446 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.805238 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/56911694-2580-4427-953f-99f4d5771862-util\") pod \"56911694-2580-4427-953f-99f4d5771862\" (UID: \"56911694-2580-4427-953f-99f4d5771862\") "
Apr 23 16:45:08.805446 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.805265 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znb7x\" (UniqueName: \"kubernetes.io/projected/2ff500bc-4527-4d6d-9733-cb661b777ae1-kube-api-access-znb7x\") pod \"2ff500bc-4527-4d6d-9733-cb661b777ae1\" (UID: \"2ff500bc-4527-4d6d-9733-cb661b777ae1\") "
Apr 23 16:45:08.805962 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.805928 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56911694-2580-4427-953f-99f4d5771862-bundle" (OuterVolumeSpecName: "bundle") pod "56911694-2580-4427-953f-99f4d5771862" (UID: "56911694-2580-4427-953f-99f4d5771862"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:45:08.806058 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.805940 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8dae3ca-89b7-4da1-b2ce-754f4271850d-bundle" (OuterVolumeSpecName: "bundle") pod "d8dae3ca-89b7-4da1-b2ce-754f4271850d" (UID: "d8dae3ca-89b7-4da1-b2ce-754f4271850d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:45:08.806104 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.806061 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ff500bc-4527-4d6d-9733-cb661b777ae1-bundle" (OuterVolumeSpecName: "bundle") pod "2ff500bc-4527-4d6d-9733-cb661b777ae1" (UID: "2ff500bc-4527-4d6d-9733-cb661b777ae1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:45:08.806104 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.806066 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b939e50-5761-4424-a598-047b9627a397-bundle\") pod \"4b939e50-5761-4424-a598-047b9627a397\" (UID: \"4b939e50-5761-4424-a598-047b9627a397\") "
Apr 23 16:45:08.806242 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.806129 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d8dae3ca-89b7-4da1-b2ce-754f4271850d-util\") pod \"d8dae3ca-89b7-4da1-b2ce-754f4271850d\" (UID: \"d8dae3ca-89b7-4da1-b2ce-754f4271850d\") "
Apr 23 16:45:08.806242 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.806171 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cd4cn\" (UniqueName: \"kubernetes.io/projected/56911694-2580-4427-953f-99f4d5771862-kube-api-access-cd4cn\") pod \"56911694-2580-4427-953f-99f4d5771862\" (UID: \"56911694-2580-4427-953f-99f4d5771862\") "
Apr 23 16:45:08.806392 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.806251 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b939e50-5761-4424-a598-047b9627a397-util\") pod \"4b939e50-5761-4424-a598-047b9627a397\" (UID: \"4b939e50-5761-4424-a598-047b9627a397\") "
Apr 23 16:45:08.806619 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.806562 2580 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/56911694-2580-4427-953f-99f4d5771862-bundle\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:45:08.806619 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.806583 2580 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2ff500bc-4527-4d6d-9733-cb661b777ae1-bundle\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:45:08.806619 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.806596 2580 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d8dae3ca-89b7-4da1-b2ce-754f4271850d-bundle\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:45:08.807802 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.807775 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ff500bc-4527-4d6d-9733-cb661b777ae1-kube-api-access-znb7x" (OuterVolumeSpecName: "kube-api-access-znb7x") pod "2ff500bc-4527-4d6d-9733-cb661b777ae1" (UID: "2ff500bc-4527-4d6d-9733-cb661b777ae1"). InnerVolumeSpecName "kube-api-access-znb7x". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:45:08.807986 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.807959 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b939e50-5761-4424-a598-047b9627a397-bundle" (OuterVolumeSpecName: "bundle") pod "4b939e50-5761-4424-a598-047b9627a397" (UID: "4b939e50-5761-4424-a598-047b9627a397"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:45:08.808102 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.808075 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8dae3ca-89b7-4da1-b2ce-754f4271850d-kube-api-access-69775" (OuterVolumeSpecName: "kube-api-access-69775") pod "d8dae3ca-89b7-4da1-b2ce-754f4271850d" (UID: "d8dae3ca-89b7-4da1-b2ce-754f4271850d"). InnerVolumeSpecName "kube-api-access-69775". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:45:08.809029 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.809004 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56911694-2580-4427-953f-99f4d5771862-kube-api-access-cd4cn" (OuterVolumeSpecName: "kube-api-access-cd4cn") pod "56911694-2580-4427-953f-99f4d5771862" (UID: "56911694-2580-4427-953f-99f4d5771862"). InnerVolumeSpecName "kube-api-access-cd4cn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:45:08.809287 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.809259 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b939e50-5761-4424-a598-047b9627a397-kube-api-access-h5dkd" (OuterVolumeSpecName: "kube-api-access-h5dkd") pod "4b939e50-5761-4424-a598-047b9627a397" (UID: "4b939e50-5761-4424-a598-047b9627a397"). InnerVolumeSpecName "kube-api-access-h5dkd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:45:08.812469 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.812443 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ff500bc-4527-4d6d-9733-cb661b777ae1-util" (OuterVolumeSpecName: "util") pod "2ff500bc-4527-4d6d-9733-cb661b777ae1" (UID: "2ff500bc-4527-4d6d-9733-cb661b777ae1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:45:08.813107 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.813082 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b939e50-5761-4424-a598-047b9627a397-util" (OuterVolumeSpecName: "util") pod "4b939e50-5761-4424-a598-047b9627a397" (UID: "4b939e50-5761-4424-a598-047b9627a397"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:45:08.813273 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.813254 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8dae3ca-89b7-4da1-b2ce-754f4271850d-util" (OuterVolumeSpecName: "util") pod "d8dae3ca-89b7-4da1-b2ce-754f4271850d" (UID: "d8dae3ca-89b7-4da1-b2ce-754f4271850d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:45:08.815628 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.815593 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56911694-2580-4427-953f-99f4d5771862-util" (OuterVolumeSpecName: "util") pod "56911694-2580-4427-953f-99f4d5771862" (UID: "56911694-2580-4427-953f-99f4d5771862"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:45:08.907820 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.907781 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h5dkd\" (UniqueName: \"kubernetes.io/projected/4b939e50-5761-4424-a598-047b9627a397-kube-api-access-h5dkd\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:45:08.907820 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.907816 2580 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2ff500bc-4527-4d6d-9733-cb661b777ae1-util\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:45:08.907820 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.907827 2580 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/56911694-2580-4427-953f-99f4d5771862-util\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:45:08.908048 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.907836 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-znb7x\" (UniqueName: \"kubernetes.io/projected/2ff500bc-4527-4d6d-9733-cb661b777ae1-kube-api-access-znb7x\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:45:08.908048 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.907846 2580 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b939e50-5761-4424-a598-047b9627a397-bundle\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:45:08.908048 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.907854 2580 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d8dae3ca-89b7-4da1-b2ce-754f4271850d-util\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:45:08.908048 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.907863 2580 reconciler_common.go:299]
"Volume detached for volume \"kube-api-access-cd4cn\" (UniqueName: \"kubernetes.io/projected/56911694-2580-4427-953f-99f4d5771862-kube-api-access-cd4cn\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:45:08.908048 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.907871 2580 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b939e50-5761-4424-a598-047b9627a397-util\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:45:08.908048 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:08.907881 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-69775\" (UniqueName: \"kubernetes.io/projected/d8dae3ca-89b7-4da1-b2ce-754f4271850d-kube-api-access-69775\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:45:09.524208 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:09.524168 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30rpz28" event={"ID":"2ff500bc-4527-4d6d-9733-cb661b777ae1","Type":"ContainerDied","Data":"1303d1f2656683824d3ebcfe4df9a287a83ea94767e7b0188dcc161ab5c18a35"} Apr 23 16:45:09.524208 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:09.524205 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1303d1f2656683824d3ebcfe4df9a287a83ea94767e7b0188dcc161ab5c18a35" Apr 23 16:45:09.524208 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:09.524206 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30rpz28" Apr 23 16:45:09.525850 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:09.525811 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88j9wkl" event={"ID":"d8dae3ca-89b7-4da1-b2ce-754f4271850d","Type":"ContainerDied","Data":"53a7d519fccfc339a37becc57b217e69cbaea4d1723d9e0ccfd7c014427f7ad6"} Apr 23 16:45:09.525850 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:09.525838 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88j9wkl" Apr 23 16:45:09.525850 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:09.525852 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53a7d519fccfc339a37becc57b217e69cbaea4d1723d9e0ccfd7c014427f7ad6" Apr 23 16:45:09.527494 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:09.527470 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50357fch" event={"ID":"4b939e50-5761-4424-a598-047b9627a397","Type":"ContainerDied","Data":"f9ee8104dac45f45a196250da4cf882f7b66ef1d6618b8e2126d0537c4a51663"} Apr 23 16:45:09.527620 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:09.527497 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9ee8104dac45f45a196250da4cf882f7b66ef1d6618b8e2126d0537c4a51663" Apr 23 16:45:09.527620 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:09.527515 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50357fch" Apr 23 16:45:09.529245 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:09.529222 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bljvhw" event={"ID":"56911694-2580-4427-953f-99f4d5771862","Type":"ContainerDied","Data":"4d6b2d7958e4da594e805986ed9fb8c0174634048c124fa745b78ffda7c6cce3"} Apr 23 16:45:09.529245 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:09.529245 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d6b2d7958e4da594e805986ed9fb8c0174634048c124fa745b78ffda7c6cce3" Apr 23 16:45:09.529435 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:09.529332 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bljvhw" Apr 23 16:45:14.463212 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:14.463185 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfkqz_5949893b-cd3d-46d5-b194-4ef1ad542b81/ovn-acl-logging/0.log" Apr 23 16:45:14.463653 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:14.463350 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfkqz_5949893b-cd3d-46d5-b194-4ef1ad542b81/ovn-acl-logging/0.log" Apr 23 16:45:15.791092 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.791054 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-mqgtx"] Apr 23 16:45:15.793580 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.791555 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b939e50-5761-4424-a598-047b9627a397" containerName="pull" Apr 23 16:45:15.793580 ip-10-0-128-198 kubenswrapper[2580]: I0423 
16:45:15.791573 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b939e50-5761-4424-a598-047b9627a397" containerName="pull" Apr 23 16:45:15.793580 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.791587 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56911694-2580-4427-953f-99f4d5771862" containerName="extract" Apr 23 16:45:15.793580 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.791592 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="56911694-2580-4427-953f-99f4d5771862" containerName="extract" Apr 23 16:45:15.793580 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.791599 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b939e50-5761-4424-a598-047b9627a397" containerName="util" Apr 23 16:45:15.793580 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.791605 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b939e50-5761-4424-a598-047b9627a397" containerName="util" Apr 23 16:45:15.793580 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.791611 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d8dae3ca-89b7-4da1-b2ce-754f4271850d" containerName="pull" Apr 23 16:45:15.793580 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.791616 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8dae3ca-89b7-4da1-b2ce-754f4271850d" containerName="pull" Apr 23 16:45:15.793580 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.791625 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56911694-2580-4427-953f-99f4d5771862" containerName="pull" Apr 23 16:45:15.793580 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.791630 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="56911694-2580-4427-953f-99f4d5771862" containerName="pull" Apr 23 16:45:15.793580 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.791638 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="4b939e50-5761-4424-a598-047b9627a397" containerName="extract" Apr 23 16:45:15.793580 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.791644 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b939e50-5761-4424-a598-047b9627a397" containerName="extract" Apr 23 16:45:15.793580 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.791651 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ff500bc-4527-4d6d-9733-cb661b777ae1" containerName="pull" Apr 23 16:45:15.793580 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.791656 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff500bc-4527-4d6d-9733-cb661b777ae1" containerName="pull" Apr 23 16:45:15.793580 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.791661 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ff500bc-4527-4d6d-9733-cb661b777ae1" containerName="extract" Apr 23 16:45:15.793580 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.791665 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff500bc-4527-4d6d-9733-cb661b777ae1" containerName="extract" Apr 23 16:45:15.793580 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.791673 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56911694-2580-4427-953f-99f4d5771862" containerName="util" Apr 23 16:45:15.793580 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.791678 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="56911694-2580-4427-953f-99f4d5771862" containerName="util" Apr 23 16:45:15.793580 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.791686 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ff500bc-4527-4d6d-9733-cb661b777ae1" containerName="util" Apr 23 16:45:15.793580 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.791690 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff500bc-4527-4d6d-9733-cb661b777ae1" containerName="util" 
Apr 23 16:45:15.793580 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.791695 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d8dae3ca-89b7-4da1-b2ce-754f4271850d" containerName="extract" Apr 23 16:45:15.793580 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.791700 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8dae3ca-89b7-4da1-b2ce-754f4271850d" containerName="extract" Apr 23 16:45:15.793580 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.791710 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d8dae3ca-89b7-4da1-b2ce-754f4271850d" containerName="util" Apr 23 16:45:15.793580 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.791715 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8dae3ca-89b7-4da1-b2ce-754f4271850d" containerName="util" Apr 23 16:45:15.793580 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.791769 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="56911694-2580-4427-953f-99f4d5771862" containerName="extract" Apr 23 16:45:15.793580 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.791780 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="2ff500bc-4527-4d6d-9733-cb661b777ae1" containerName="extract" Apr 23 16:45:15.793580 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.791786 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="4b939e50-5761-4424-a598-047b9627a397" containerName="extract" Apr 23 16:45:15.793580 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.791793 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="d8dae3ca-89b7-4da1-b2ce-754f4271850d" containerName="extract" Apr 23 16:45:15.794646 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.794626 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-mqgtx" Apr 23 16:45:15.797891 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.797866 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-h59dm\"" Apr 23 16:45:15.798382 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.798358 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 23 16:45:15.798638 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.798610 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 23 16:45:15.806418 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.806338 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-mqgtx"] Apr 23 16:45:15.866125 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.866090 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhljs\" (UniqueName: \"kubernetes.io/projected/6fc9d8b5-302d-46a4-acf1-8e2b2d75a8b2-kube-api-access-zhljs\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-mqgtx\" (UID: \"6fc9d8b5-302d-46a4-acf1-8e2b2d75a8b2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-mqgtx" Apr 23 16:45:15.866125 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.866128 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6fc9d8b5-302d-46a4-acf1-8e2b2d75a8b2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-mqgtx\" (UID: \"6fc9d8b5-302d-46a4-acf1-8e2b2d75a8b2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-mqgtx" Apr 23 16:45:15.967371 
ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.967329 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhljs\" (UniqueName: \"kubernetes.io/projected/6fc9d8b5-302d-46a4-acf1-8e2b2d75a8b2-kube-api-access-zhljs\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-mqgtx\" (UID: \"6fc9d8b5-302d-46a4-acf1-8e2b2d75a8b2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-mqgtx" Apr 23 16:45:15.967540 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.967386 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6fc9d8b5-302d-46a4-acf1-8e2b2d75a8b2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-mqgtx\" (UID: \"6fc9d8b5-302d-46a4-acf1-8e2b2d75a8b2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-mqgtx" Apr 23 16:45:15.967725 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.967705 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6fc9d8b5-302d-46a4-acf1-8e2b2d75a8b2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-mqgtx\" (UID: \"6fc9d8b5-302d-46a4-acf1-8e2b2d75a8b2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-mqgtx" Apr 23 16:45:15.978784 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:15.978752 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhljs\" (UniqueName: \"kubernetes.io/projected/6fc9d8b5-302d-46a4-acf1-8e2b2d75a8b2-kube-api-access-zhljs\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-mqgtx\" (UID: \"6fc9d8b5-302d-46a4-acf1-8e2b2d75a8b2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-mqgtx" Apr 23 16:45:16.110920 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:16.110834 2580 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-mqgtx" Apr 23 16:45:16.260247 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:16.260207 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-mqgtx"] Apr 23 16:45:16.265096 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:45:16.265070 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fc9d8b5_302d_46a4_acf1_8e2b2d75a8b2.slice/crio-ffaaa1b0f8dd961951164713250451be0971f726ac014ca6cbcbb96c0af18668 WatchSource:0}: Error finding container ffaaa1b0f8dd961951164713250451be0971f726ac014ca6cbcbb96c0af18668: Status 404 returned error can't find the container with id ffaaa1b0f8dd961951164713250451be0971f726ac014ca6cbcbb96c0af18668 Apr 23 16:45:16.555457 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:16.555419 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-mqgtx" event={"ID":"6fc9d8b5-302d-46a4-acf1-8e2b2d75a8b2","Type":"ContainerStarted","Data":"ffaaa1b0f8dd961951164713250451be0971f726ac014ca6cbcbb96c0af18668"} Apr 23 16:45:18.465334 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:18.465279 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-5xwxm"] Apr 23 16:45:18.470000 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:18.469974 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-5xwxm" Apr 23 16:45:18.473841 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:18.473730 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-kgn8p\"" Apr 23 16:45:18.479620 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:18.479591 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-5xwxm"] Apr 23 16:45:18.590360 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:18.590316 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdnhz\" (UniqueName: \"kubernetes.io/projected/9d6557a0-dda5-43f0-ad59-78c18ad4d290-kube-api-access-zdnhz\") pod \"authorino-operator-7587b89b76-5xwxm\" (UID: \"9d6557a0-dda5-43f0-ad59-78c18ad4d290\") " pod="kuadrant-system/authorino-operator-7587b89b76-5xwxm" Apr 23 16:45:18.691904 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:18.691859 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdnhz\" (UniqueName: \"kubernetes.io/projected/9d6557a0-dda5-43f0-ad59-78c18ad4d290-kube-api-access-zdnhz\") pod \"authorino-operator-7587b89b76-5xwxm\" (UID: \"9d6557a0-dda5-43f0-ad59-78c18ad4d290\") " pod="kuadrant-system/authorino-operator-7587b89b76-5xwxm" Apr 23 16:45:18.701507 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:18.701469 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdnhz\" (UniqueName: \"kubernetes.io/projected/9d6557a0-dda5-43f0-ad59-78c18ad4d290-kube-api-access-zdnhz\") pod \"authorino-operator-7587b89b76-5xwxm\" (UID: \"9d6557a0-dda5-43f0-ad59-78c18ad4d290\") " pod="kuadrant-system/authorino-operator-7587b89b76-5xwxm" Apr 23 16:45:18.786233 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:18.786195 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-5xwxm" Apr 23 16:45:19.674335 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:19.674306 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-5xwxm"] Apr 23 16:45:19.676407 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:45:19.676376 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d6557a0_dda5_43f0_ad59_78c18ad4d290.slice/crio-ec6c76e2ac6ba7f44e807d1556f49a0ea2aec698e615b85922f4cf030962dfe2 WatchSource:0}: Error finding container ec6c76e2ac6ba7f44e807d1556f49a0ea2aec698e615b85922f4cf030962dfe2: Status 404 returned error can't find the container with id ec6c76e2ac6ba7f44e807d1556f49a0ea2aec698e615b85922f4cf030962dfe2 Apr 23 16:45:20.574575 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:20.574532 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-5xwxm" event={"ID":"9d6557a0-dda5-43f0-ad59-78c18ad4d290","Type":"ContainerStarted","Data":"ec6c76e2ac6ba7f44e807d1556f49a0ea2aec698e615b85922f4cf030962dfe2"} Apr 23 16:45:21.580979 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:21.580922 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-mqgtx" event={"ID":"6fc9d8b5-302d-46a4-acf1-8e2b2d75a8b2","Type":"ContainerStarted","Data":"84bb839d91131f28b4a7b6aa0a009e65c90623b43ce9fd48315db08ca65039b4"} Apr 23 16:45:21.581513 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:21.581188 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-mqgtx" Apr 23 16:45:21.613070 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:21.613011 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-mqgtx" podStartSLOduration=1.829162704 podStartE2EDuration="6.612994073s" podCreationTimestamp="2026-04-23 16:45:15 +0000 UTC" firstStartedPulling="2026-04-23 16:45:16.267385885 +0000 UTC m=+602.328460116" lastFinishedPulling="2026-04-23 16:45:21.051217246 +0000 UTC m=+607.112291485" observedRunningTime="2026-04-23 16:45:21.609568511 +0000 UTC m=+607.670642766" watchObservedRunningTime="2026-04-23 16:45:21.612994073 +0000 UTC m=+607.674068327" Apr 23 16:45:22.527178 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:22.527104 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-66bd4b4847-f82xw" podUID="0a4f8871-af5b-4b90-bf35-070cc28d88e0" containerName="console" containerID="cri-o://b5d89802c8934db76a9a0a21cd87bb334348e21559c21c0b7c0f5e22920f4623" gracePeriod=15 Apr 23 16:45:22.947649 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:22.947630 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66bd4b4847-f82xw_0a4f8871-af5b-4b90-bf35-070cc28d88e0/console/0.log" Apr 23 16:45:22.947916 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:22.947687 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66bd4b4847-f82xw" Apr 23 16:45:23.031830 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.031800 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a4f8871-af5b-4b90-bf35-070cc28d88e0-console-config\") pod \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\" (UID: \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\") " Apr 23 16:45:23.031993 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.031846 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8r9r\" (UniqueName: \"kubernetes.io/projected/0a4f8871-af5b-4b90-bf35-070cc28d88e0-kube-api-access-h8r9r\") pod \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\" (UID: \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\") " Apr 23 16:45:23.031993 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.031872 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a4f8871-af5b-4b90-bf35-070cc28d88e0-oauth-serving-cert\") pod \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\" (UID: \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\") " Apr 23 16:45:23.032106 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.032056 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a4f8871-af5b-4b90-bf35-070cc28d88e0-service-ca\") pod \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\" (UID: \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\") " Apr 23 16:45:23.032156 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.032138 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a4f8871-af5b-4b90-bf35-070cc28d88e0-console-serving-cert\") pod \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\" (UID: \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\") " Apr 23 16:45:23.032209 
ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.032173 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a4f8871-af5b-4b90-bf35-070cc28d88e0-console-oauth-config\") pod \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\" (UID: \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\") " Apr 23 16:45:23.032209 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.032183 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a4f8871-af5b-4b90-bf35-070cc28d88e0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0a4f8871-af5b-4b90-bf35-070cc28d88e0" (UID: "0a4f8871-af5b-4b90-bf35-070cc28d88e0"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:45:23.032333 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.032214 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a4f8871-af5b-4b90-bf35-070cc28d88e0-trusted-ca-bundle\") pod \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\" (UID: \"0a4f8871-af5b-4b90-bf35-070cc28d88e0\") " Apr 23 16:45:23.032538 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.032482 2580 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a4f8871-af5b-4b90-bf35-070cc28d88e0-oauth-serving-cert\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:45:23.032648 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.032582 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a4f8871-af5b-4b90-bf35-070cc28d88e0-service-ca" (OuterVolumeSpecName: "service-ca") pod "0a4f8871-af5b-4b90-bf35-070cc28d88e0" (UID: "0a4f8871-af5b-4b90-bf35-070cc28d88e0"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:45:23.032706 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.032659 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a4f8871-af5b-4b90-bf35-070cc28d88e0-console-config" (OuterVolumeSpecName: "console-config") pod "0a4f8871-af5b-4b90-bf35-070cc28d88e0" (UID: "0a4f8871-af5b-4b90-bf35-070cc28d88e0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:45:23.032756 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.032694 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a4f8871-af5b-4b90-bf35-070cc28d88e0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0a4f8871-af5b-4b90-bf35-070cc28d88e0" (UID: "0a4f8871-af5b-4b90-bf35-070cc28d88e0"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:45:23.034056 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.034028 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a4f8871-af5b-4b90-bf35-070cc28d88e0-kube-api-access-h8r9r" (OuterVolumeSpecName: "kube-api-access-h8r9r") pod "0a4f8871-af5b-4b90-bf35-070cc28d88e0" (UID: "0a4f8871-af5b-4b90-bf35-070cc28d88e0"). InnerVolumeSpecName "kube-api-access-h8r9r". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:45:23.034163 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.034075 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a4f8871-af5b-4b90-bf35-070cc28d88e0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0a4f8871-af5b-4b90-bf35-070cc28d88e0" (UID: "0a4f8871-af5b-4b90-bf35-070cc28d88e0"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:45:23.034234 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.034220 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a4f8871-af5b-4b90-bf35-070cc28d88e0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0a4f8871-af5b-4b90-bf35-070cc28d88e0" (UID: "0a4f8871-af5b-4b90-bf35-070cc28d88e0"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:45:23.133747 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.133668 2580 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a4f8871-af5b-4b90-bf35-070cc28d88e0-console-config\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:45:23.133747 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.133697 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h8r9r\" (UniqueName: \"kubernetes.io/projected/0a4f8871-af5b-4b90-bf35-070cc28d88e0-kube-api-access-h8r9r\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:45:23.133747 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.133707 2580 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a4f8871-af5b-4b90-bf35-070cc28d88e0-service-ca\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:45:23.133747 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.133717 2580 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a4f8871-af5b-4b90-bf35-070cc28d88e0-console-serving-cert\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:45:23.133747 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.133725 2580 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/0a4f8871-af5b-4b90-bf35-070cc28d88e0-console-oauth-config\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:45:23.133747 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.133733 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a4f8871-af5b-4b90-bf35-070cc28d88e0-trusted-ca-bundle\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:45:23.589464 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.589438 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66bd4b4847-f82xw_0a4f8871-af5b-4b90-bf35-070cc28d88e0/console/0.log" Apr 23 16:45:23.589634 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.589477 2580 generic.go:358] "Generic (PLEG): container finished" podID="0a4f8871-af5b-4b90-bf35-070cc28d88e0" containerID="b5d89802c8934db76a9a0a21cd87bb334348e21559c21c0b7c0f5e22920f4623" exitCode=2 Apr 23 16:45:23.589634 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.589536 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66bd4b4847-f82xw" event={"ID":"0a4f8871-af5b-4b90-bf35-070cc28d88e0","Type":"ContainerDied","Data":"b5d89802c8934db76a9a0a21cd87bb334348e21559c21c0b7c0f5e22920f4623"} Apr 23 16:45:23.589634 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.589542 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66bd4b4847-f82xw" Apr 23 16:45:23.589634 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.589559 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66bd4b4847-f82xw" event={"ID":"0a4f8871-af5b-4b90-bf35-070cc28d88e0","Type":"ContainerDied","Data":"cf47cb25b781aad2c97f5c4002d86a5d0dee577175571075752cc15633adda02"} Apr 23 16:45:23.589634 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.589575 2580 scope.go:117] "RemoveContainer" containerID="b5d89802c8934db76a9a0a21cd87bb334348e21559c21c0b7c0f5e22920f4623" Apr 23 16:45:23.591168 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.591144 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-5xwxm" event={"ID":"9d6557a0-dda5-43f0-ad59-78c18ad4d290","Type":"ContainerStarted","Data":"abc9a162331942da61235af2fa826fdc798896bb66a681beacd04bc2901272ee"} Apr 23 16:45:23.591329 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.591311 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-5xwxm" Apr 23 16:45:23.598394 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.598373 2580 scope.go:117] "RemoveContainer" containerID="b5d89802c8934db76a9a0a21cd87bb334348e21559c21c0b7c0f5e22920f4623" Apr 23 16:45:23.598657 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:45:23.598638 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5d89802c8934db76a9a0a21cd87bb334348e21559c21c0b7c0f5e22920f4623\": container with ID starting with b5d89802c8934db76a9a0a21cd87bb334348e21559c21c0b7c0f5e22920f4623 not found: ID does not exist" containerID="b5d89802c8934db76a9a0a21cd87bb334348e21559c21c0b7c0f5e22920f4623" Apr 23 16:45:23.598727 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.598669 2580 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"b5d89802c8934db76a9a0a21cd87bb334348e21559c21c0b7c0f5e22920f4623"} err="failed to get container status \"b5d89802c8934db76a9a0a21cd87bb334348e21559c21c0b7c0f5e22920f4623\": rpc error: code = NotFound desc = could not find container \"b5d89802c8934db76a9a0a21cd87bb334348e21559c21c0b7c0f5e22920f4623\": container with ID starting with b5d89802c8934db76a9a0a21cd87bb334348e21559c21c0b7c0f5e22920f4623 not found: ID does not exist" Apr 23 16:45:23.618259 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.618214 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-5xwxm" podStartSLOduration=2.424348582 podStartE2EDuration="5.618202817s" podCreationTimestamp="2026-04-23 16:45:18 +0000 UTC" firstStartedPulling="2026-04-23 16:45:19.679376143 +0000 UTC m=+605.740450413" lastFinishedPulling="2026-04-23 16:45:22.873230416 +0000 UTC m=+608.934304648" observedRunningTime="2026-04-23 16:45:23.615590203 +0000 UTC m=+609.676664469" watchObservedRunningTime="2026-04-23 16:45:23.618202817 +0000 UTC m=+609.679277115" Apr 23 16:45:23.644559 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.644527 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66bd4b4847-f82xw"] Apr 23 16:45:23.649612 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:23.649584 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-66bd4b4847-f82xw"] Apr 23 16:45:24.537916 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:24.537882 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a4f8871-af5b-4b90-bf35-070cc28d88e0" path="/var/lib/kubelet/pods/0a4f8871-af5b-4b90-bf35-070cc28d88e0/volumes" Apr 23 16:45:32.587611 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:32.587528 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-mqgtx" 
Apr 23 16:45:34.597734 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:45:34.597704 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-5xwxm" Apr 23 16:46:09.248265 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:09.248227 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-9jtks"] Apr 23 16:46:09.248821 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:09.248780 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a4f8871-af5b-4b90-bf35-070cc28d88e0" containerName="console" Apr 23 16:46:09.248821 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:09.248801 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a4f8871-af5b-4b90-bf35-070cc28d88e0" containerName="console" Apr 23 16:46:09.248942 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:09.248899 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a4f8871-af5b-4b90-bf35-070cc28d88e0" containerName="console" Apr 23 16:46:09.251891 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:09.251869 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-9jtks" Apr 23 16:46:09.254768 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:09.254745 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 23 16:46:09.254768 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:09.254762 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-xz7jf\"" Apr 23 16:46:09.260593 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:09.260567 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-9jtks"] Apr 23 16:46:09.294377 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:09.294345 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-9jtks"] Apr 23 16:46:09.306387 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:09.306355 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbmwp\" (UniqueName: \"kubernetes.io/projected/b1a52c1c-80a1-4ed7-9fb4-6529bd1c776a-kube-api-access-hbmwp\") pod \"limitador-limitador-67566c68b4-9jtks\" (UID: \"b1a52c1c-80a1-4ed7-9fb4-6529bd1c776a\") " pod="kuadrant-system/limitador-limitador-67566c68b4-9jtks" Apr 23 16:46:09.306508 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:09.306391 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b1a52c1c-80a1-4ed7-9fb4-6529bd1c776a-config-file\") pod \"limitador-limitador-67566c68b4-9jtks\" (UID: \"b1a52c1c-80a1-4ed7-9fb4-6529bd1c776a\") " pod="kuadrant-system/limitador-limitador-67566c68b4-9jtks" Apr 23 16:46:09.407337 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:09.407301 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hbmwp\" (UniqueName: \"kubernetes.io/projected/b1a52c1c-80a1-4ed7-9fb4-6529bd1c776a-kube-api-access-hbmwp\") pod \"limitador-limitador-67566c68b4-9jtks\" (UID: \"b1a52c1c-80a1-4ed7-9fb4-6529bd1c776a\") " pod="kuadrant-system/limitador-limitador-67566c68b4-9jtks" Apr 23 16:46:09.407337 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:09.407341 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b1a52c1c-80a1-4ed7-9fb4-6529bd1c776a-config-file\") pod \"limitador-limitador-67566c68b4-9jtks\" (UID: \"b1a52c1c-80a1-4ed7-9fb4-6529bd1c776a\") " pod="kuadrant-system/limitador-limitador-67566c68b4-9jtks" Apr 23 16:46:09.407969 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:09.407949 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b1a52c1c-80a1-4ed7-9fb4-6529bd1c776a-config-file\") pod \"limitador-limitador-67566c68b4-9jtks\" (UID: \"b1a52c1c-80a1-4ed7-9fb4-6529bd1c776a\") " pod="kuadrant-system/limitador-limitador-67566c68b4-9jtks" Apr 23 16:46:09.418841 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:09.418811 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbmwp\" (UniqueName: \"kubernetes.io/projected/b1a52c1c-80a1-4ed7-9fb4-6529bd1c776a-kube-api-access-hbmwp\") pod \"limitador-limitador-67566c68b4-9jtks\" (UID: \"b1a52c1c-80a1-4ed7-9fb4-6529bd1c776a\") " pod="kuadrant-system/limitador-limitador-67566c68b4-9jtks" Apr 23 16:46:09.562849 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:09.562763 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-9jtks" Apr 23 16:46:09.690944 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:09.690914 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-9jtks"] Apr 23 16:46:09.693613 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:46:09.693580 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1a52c1c_80a1_4ed7_9fb4_6529bd1c776a.slice/crio-bbf53c05254328106743a25e52d43040b0dc865b67a89581ca5cd3491d9f742e WatchSource:0}: Error finding container bbf53c05254328106743a25e52d43040b0dc865b67a89581ca5cd3491d9f742e: Status 404 returned error can't find the container with id bbf53c05254328106743a25e52d43040b0dc865b67a89581ca5cd3491d9f742e Apr 23 16:46:09.695996 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:09.695977 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 16:46:09.768932 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:09.768895 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-9jtks" event={"ID":"b1a52c1c-80a1-4ed7-9fb4-6529bd1c776a","Type":"ContainerStarted","Data":"bbf53c05254328106743a25e52d43040b0dc865b67a89581ca5cd3491d9f742e"} Apr 23 16:46:13.786914 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:13.786870 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-9jtks" event={"ID":"b1a52c1c-80a1-4ed7-9fb4-6529bd1c776a","Type":"ContainerStarted","Data":"1862d5121e1fc1b6b8a5b40bc168153232304abb5560ef8d785e26f251ffe87f"} Apr 23 16:46:13.787395 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:13.786936 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-67566c68b4-9jtks" Apr 23 16:46:13.803791 ip-10-0-128-198 kubenswrapper[2580]: 
I0423 16:46:13.803747 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-67566c68b4-9jtks" podStartSLOduration=1.136743844 podStartE2EDuration="4.803732884s" podCreationTimestamp="2026-04-23 16:46:09 +0000 UTC" firstStartedPulling="2026-04-23 16:46:09.696105074 +0000 UTC m=+655.757179306" lastFinishedPulling="2026-04-23 16:46:13.36309411 +0000 UTC m=+659.424168346" observedRunningTime="2026-04-23 16:46:13.801620216 +0000 UTC m=+659.862694469" watchObservedRunningTime="2026-04-23 16:46:13.803732884 +0000 UTC m=+659.864807138" Apr 23 16:46:24.792275 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:24.792243 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-67566c68b4-9jtks" Apr 23 16:46:47.777462 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:47.777424 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc"] Apr 23 16:46:47.777928 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:47.777667 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" podUID="16ea3634-70d5-4dbb-b92a-56a222a8bebe" containerName="discovery" containerID="cri-o://63d51fdfa0100c66432086320efe078843a9f7a76f15ba817b62d45a45099aeb" gracePeriod=30 Apr 23 16:46:47.927104 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:47.927071 2580 generic.go:358] "Generic (PLEG): container finished" podID="16ea3634-70d5-4dbb-b92a-56a222a8bebe" containerID="63d51fdfa0100c66432086320efe078843a9f7a76f15ba817b62d45a45099aeb" exitCode=0 Apr 23 16:46:47.927280 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:47.927111 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" 
event={"ID":"16ea3634-70d5-4dbb-b92a-56a222a8bebe","Type":"ContainerDied","Data":"63d51fdfa0100c66432086320efe078843a9f7a76f15ba817b62d45a45099aeb"} Apr 23 16:46:48.043567 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:48.043538 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" Apr 23 16:46:48.135577 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:48.135538 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/16ea3634-70d5-4dbb-b92a-56a222a8bebe-istio-csr-ca-configmap\") pod \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\" (UID: \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\") " Apr 23 16:46:48.135737 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:48.135585 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/16ea3634-70d5-4dbb-b92a-56a222a8bebe-local-certs\") pod \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\" (UID: \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\") " Apr 23 16:46:48.135737 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:48.135618 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/16ea3634-70d5-4dbb-b92a-56a222a8bebe-istio-csr-dns-cert\") pod \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\" (UID: \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\") " Apr 23 16:46:48.135737 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:48.135669 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/16ea3634-70d5-4dbb-b92a-56a222a8bebe-istio-token\") pod \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\" (UID: \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\") " Apr 23 16:46:48.135737 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:48.135700 2580 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-gwvzb\" (UniqueName: \"kubernetes.io/projected/16ea3634-70d5-4dbb-b92a-56a222a8bebe-kube-api-access-gwvzb\") pod \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\" (UID: \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\") " Apr 23 16:46:48.135737 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:48.135728 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/16ea3634-70d5-4dbb-b92a-56a222a8bebe-istio-kubeconfig\") pod \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\" (UID: \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\") " Apr 23 16:46:48.135995 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:48.135771 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/16ea3634-70d5-4dbb-b92a-56a222a8bebe-cacerts\") pod \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\" (UID: \"16ea3634-70d5-4dbb-b92a-56a222a8bebe\") " Apr 23 16:46:48.135995 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:48.135963 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16ea3634-70d5-4dbb-b92a-56a222a8bebe-istio-csr-ca-configmap" (OuterVolumeSpecName: "istio-csr-ca-configmap") pod "16ea3634-70d5-4dbb-b92a-56a222a8bebe" (UID: "16ea3634-70d5-4dbb-b92a-56a222a8bebe"). InnerVolumeSpecName "istio-csr-ca-configmap". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:46:48.138197 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:48.138161 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16ea3634-70d5-4dbb-b92a-56a222a8bebe-istio-token" (OuterVolumeSpecName: "istio-token") pod "16ea3634-70d5-4dbb-b92a-56a222a8bebe" (UID: "16ea3634-70d5-4dbb-b92a-56a222a8bebe"). InnerVolumeSpecName "istio-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:46:48.138331 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:48.138283 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16ea3634-70d5-4dbb-b92a-56a222a8bebe-kube-api-access-gwvzb" (OuterVolumeSpecName: "kube-api-access-gwvzb") pod "16ea3634-70d5-4dbb-b92a-56a222a8bebe" (UID: "16ea3634-70d5-4dbb-b92a-56a222a8bebe"). InnerVolumeSpecName "kube-api-access-gwvzb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:46:48.138405 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:48.138327 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16ea3634-70d5-4dbb-b92a-56a222a8bebe-istio-kubeconfig" (OuterVolumeSpecName: "istio-kubeconfig") pod "16ea3634-70d5-4dbb-b92a-56a222a8bebe" (UID: "16ea3634-70d5-4dbb-b92a-56a222a8bebe"). InnerVolumeSpecName "istio-kubeconfig". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:46:48.138405 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:48.138366 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16ea3634-70d5-4dbb-b92a-56a222a8bebe-cacerts" (OuterVolumeSpecName: "cacerts") pod "16ea3634-70d5-4dbb-b92a-56a222a8bebe" (UID: "16ea3634-70d5-4dbb-b92a-56a222a8bebe"). InnerVolumeSpecName "cacerts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:46:48.138513 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:48.138474 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16ea3634-70d5-4dbb-b92a-56a222a8bebe-istio-csr-dns-cert" (OuterVolumeSpecName: "istio-csr-dns-cert") pod "16ea3634-70d5-4dbb-b92a-56a222a8bebe" (UID: "16ea3634-70d5-4dbb-b92a-56a222a8bebe"). InnerVolumeSpecName "istio-csr-dns-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:46:48.138654 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:48.138631 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16ea3634-70d5-4dbb-b92a-56a222a8bebe-local-certs" (OuterVolumeSpecName: "local-certs") pod "16ea3634-70d5-4dbb-b92a-56a222a8bebe" (UID: "16ea3634-70d5-4dbb-b92a-56a222a8bebe"). InnerVolumeSpecName "local-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:46:48.236430 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:48.236397 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gwvzb\" (UniqueName: \"kubernetes.io/projected/16ea3634-70d5-4dbb-b92a-56a222a8bebe-kube-api-access-gwvzb\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:46:48.236430 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:48.236428 2580 reconciler_common.go:299] "Volume detached for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/16ea3634-70d5-4dbb-b92a-56a222a8bebe-istio-kubeconfig\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:46:48.236430 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:48.236438 2580 reconciler_common.go:299] "Volume detached for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/16ea3634-70d5-4dbb-b92a-56a222a8bebe-cacerts\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:46:48.236677 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:48.236447 2580 reconciler_common.go:299] "Volume detached for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/16ea3634-70d5-4dbb-b92a-56a222a8bebe-istio-csr-ca-configmap\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:46:48.236677 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:48.236457 2580 reconciler_common.go:299] "Volume detached for volume \"local-certs\" (UniqueName: 
\"kubernetes.io/empty-dir/16ea3634-70d5-4dbb-b92a-56a222a8bebe-local-certs\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:46:48.236677 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:48.236470 2580 reconciler_common.go:299] "Volume detached for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/16ea3634-70d5-4dbb-b92a-56a222a8bebe-istio-csr-dns-cert\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:46:48.236677 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:48.236481 2580 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/16ea3634-70d5-4dbb-b92a-56a222a8bebe-istio-token\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:46:48.932430 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:48.932345 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" Apr 23 16:46:48.932430 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:48.932347 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc" event={"ID":"16ea3634-70d5-4dbb-b92a-56a222a8bebe","Type":"ContainerDied","Data":"bcb87fb54d68f80e11a0d7af9f0f8e37d456eedfb1b11e58b2a95cb418d8a434"} Apr 23 16:46:48.932907 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:48.932460 2580 scope.go:117] "RemoveContainer" containerID="63d51fdfa0100c66432086320efe078843a9f7a76f15ba817b62d45a45099aeb" Apr 23 16:46:48.952380 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:48.952354 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc"] Apr 23 16:46:48.961451 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:48.961426 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-mppdc"] Apr 23 16:46:50.537708 ip-10-0-128-198 
kubenswrapper[2580]: I0423 16:46:50.537672 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16ea3634-70d5-4dbb-b92a-56a222a8bebe" path="/var/lib/kubelet/pods/16ea3634-70d5-4dbb-b92a-56a222a8bebe/volumes" Apr 23 16:46:52.600173 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.600138 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-jdd7t"] Apr 23 16:46:52.600663 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.600499 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="16ea3634-70d5-4dbb-b92a-56a222a8bebe" containerName="discovery" Apr 23 16:46:52.600663 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.600511 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="16ea3634-70d5-4dbb-b92a-56a222a8bebe" containerName="discovery" Apr 23 16:46:52.600663 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.600573 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="16ea3634-70d5-4dbb-b92a-56a222a8bebe" containerName="discovery" Apr 23 16:46:52.605059 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.605034 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-5b898d7b9d-jdd7t" Apr 23 16:46:52.607949 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.607693 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 23 16:46:52.607949 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.607696 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 23 16:46:52.608934 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.608915 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 23 16:46:52.609083 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.609023 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-z5x5j\"" Apr 23 16:46:52.611589 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.611568 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-jdd7t"] Apr 23 16:46:52.617397 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.617374 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-6cf46f7d78-p6tn5"] Apr 23 16:46:52.620847 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.620831 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-6cf46f7d78-p6tn5" Apr 23 16:46:52.623560 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.623539 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 23 16:46:52.623666 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.623547 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-mv4mn\"" Apr 23 16:46:52.634760 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.634737 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-6cf46f7d78-p6tn5"] Apr 23 16:46:52.640960 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.640933 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-rkbz5"] Apr 23 16:46:52.644686 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.644666 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-rkbz5" Apr 23 16:46:52.647899 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.647880 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 23 16:46:52.648004 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.647953 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-gsmdv\"" Apr 23 16:46:52.652484 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.652459 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-rkbz5"] Apr 23 16:46:52.676353 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.676324 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dfe094ff-b04d-4c9e-9dcc-cfe7279e071f-cert\") pod \"llmisvc-controller-manager-6cf46f7d78-p6tn5\" (UID: \"dfe094ff-b04d-4c9e-9dcc-cfe7279e071f\") " pod="kserve/llmisvc-controller-manager-6cf46f7d78-p6tn5" Apr 23 16:46:52.676496 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.676371 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h572h\" (UniqueName: \"kubernetes.io/projected/dfe094ff-b04d-4c9e-9dcc-cfe7279e071f-kube-api-access-h572h\") pod \"llmisvc-controller-manager-6cf46f7d78-p6tn5\" (UID: \"dfe094ff-b04d-4c9e-9dcc-cfe7279e071f\") " pod="kserve/llmisvc-controller-manager-6cf46f7d78-p6tn5" Apr 23 16:46:52.777615 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.777581 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1d6b432c-ebcd-4f32-8197-5a410b0ef10c-data\") pod \"seaweedfs-86cc847c5c-rkbz5\" (UID: \"1d6b432c-ebcd-4f32-8197-5a410b0ef10c\") " pod="kserve/seaweedfs-86cc847c5c-rkbz5" Apr 23 16:46:52.777797 ip-10-0-128-198 
kubenswrapper[2580]: I0423 16:46:52.777629 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj29j\" (UniqueName: \"kubernetes.io/projected/eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c-kube-api-access-zj29j\") pod \"kserve-controller-manager-5b898d7b9d-jdd7t\" (UID: \"eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c\") " pod="kserve/kserve-controller-manager-5b898d7b9d-jdd7t"
Apr 23 16:46:52.777797 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.777725 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mddq\" (UniqueName: \"kubernetes.io/projected/1d6b432c-ebcd-4f32-8197-5a410b0ef10c-kube-api-access-7mddq\") pod \"seaweedfs-86cc847c5c-rkbz5\" (UID: \"1d6b432c-ebcd-4f32-8197-5a410b0ef10c\") " pod="kserve/seaweedfs-86cc847c5c-rkbz5"
Apr 23 16:46:52.777797 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.777757 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dfe094ff-b04d-4c9e-9dcc-cfe7279e071f-cert\") pod \"llmisvc-controller-manager-6cf46f7d78-p6tn5\" (UID: \"dfe094ff-b04d-4c9e-9dcc-cfe7279e071f\") " pod="kserve/llmisvc-controller-manager-6cf46f7d78-p6tn5"
Apr 23 16:46:52.777797 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.777779 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c-cert\") pod \"kserve-controller-manager-5b898d7b9d-jdd7t\" (UID: \"eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c\") " pod="kserve/kserve-controller-manager-5b898d7b9d-jdd7t"
Apr 23 16:46:52.778015 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.777811 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h572h\" (UniqueName: \"kubernetes.io/projected/dfe094ff-b04d-4c9e-9dcc-cfe7279e071f-kube-api-access-h572h\") pod \"llmisvc-controller-manager-6cf46f7d78-p6tn5\" (UID: \"dfe094ff-b04d-4c9e-9dcc-cfe7279e071f\") " pod="kserve/llmisvc-controller-manager-6cf46f7d78-p6tn5"
Apr 23 16:46:52.780282 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.780252 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dfe094ff-b04d-4c9e-9dcc-cfe7279e071f-cert\") pod \"llmisvc-controller-manager-6cf46f7d78-p6tn5\" (UID: \"dfe094ff-b04d-4c9e-9dcc-cfe7279e071f\") " pod="kserve/llmisvc-controller-manager-6cf46f7d78-p6tn5"
Apr 23 16:46:52.786878 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.786849 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h572h\" (UniqueName: \"kubernetes.io/projected/dfe094ff-b04d-4c9e-9dcc-cfe7279e071f-kube-api-access-h572h\") pod \"llmisvc-controller-manager-6cf46f7d78-p6tn5\" (UID: \"dfe094ff-b04d-4c9e-9dcc-cfe7279e071f\") " pod="kserve/llmisvc-controller-manager-6cf46f7d78-p6tn5"
Apr 23 16:46:52.878810 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.878724 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7mddq\" (UniqueName: \"kubernetes.io/projected/1d6b432c-ebcd-4f32-8197-5a410b0ef10c-kube-api-access-7mddq\") pod \"seaweedfs-86cc847c5c-rkbz5\" (UID: \"1d6b432c-ebcd-4f32-8197-5a410b0ef10c\") " pod="kserve/seaweedfs-86cc847c5c-rkbz5"
Apr 23 16:46:52.878810 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.878767 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c-cert\") pod \"kserve-controller-manager-5b898d7b9d-jdd7t\" (UID: \"eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c\") " pod="kserve/kserve-controller-manager-5b898d7b9d-jdd7t"
Apr 23 16:46:52.878810 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.878804 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1d6b432c-ebcd-4f32-8197-5a410b0ef10c-data\") pod \"seaweedfs-86cc847c5c-rkbz5\" (UID: \"1d6b432c-ebcd-4f32-8197-5a410b0ef10c\") " pod="kserve/seaweedfs-86cc847c5c-rkbz5"
Apr 23 16:46:52.879029 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.878828 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zj29j\" (UniqueName: \"kubernetes.io/projected/eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c-kube-api-access-zj29j\") pod \"kserve-controller-manager-5b898d7b9d-jdd7t\" (UID: \"eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c\") " pod="kserve/kserve-controller-manager-5b898d7b9d-jdd7t"
Apr 23 16:46:52.879264 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.879244 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1d6b432c-ebcd-4f32-8197-5a410b0ef10c-data\") pod \"seaweedfs-86cc847c5c-rkbz5\" (UID: \"1d6b432c-ebcd-4f32-8197-5a410b0ef10c\") " pod="kserve/seaweedfs-86cc847c5c-rkbz5"
Apr 23 16:46:52.881202 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.881181 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c-cert\") pod \"kserve-controller-manager-5b898d7b9d-jdd7t\" (UID: \"eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c\") " pod="kserve/kserve-controller-manager-5b898d7b9d-jdd7t"
Apr 23 16:46:52.888009 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.887982 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mddq\" (UniqueName: \"kubernetes.io/projected/1d6b432c-ebcd-4f32-8197-5a410b0ef10c-kube-api-access-7mddq\") pod \"seaweedfs-86cc847c5c-rkbz5\" (UID: \"1d6b432c-ebcd-4f32-8197-5a410b0ef10c\") " pod="kserve/seaweedfs-86cc847c5c-rkbz5"
Apr 23 16:46:52.888134 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.888112 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj29j\" (UniqueName: \"kubernetes.io/projected/eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c-kube-api-access-zj29j\") pod \"kserve-controller-manager-5b898d7b9d-jdd7t\" (UID: \"eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c\") " pod="kserve/kserve-controller-manager-5b898d7b9d-jdd7t"
Apr 23 16:46:52.917792 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.917771 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-5b898d7b9d-jdd7t"
Apr 23 16:46:52.933487 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.933457 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-6cf46f7d78-p6tn5"
Apr 23 16:46:52.958681 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:52.958645 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-rkbz5"
Apr 23 16:46:53.066841 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:53.066812 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-jdd7t"]
Apr 23 16:46:53.069779 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:46:53.069748 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeeb6dc1a_18b1_4b35_a6e0_c3807f3c301c.slice/crio-40201ee45003fa97411c4499ca25c3ad46d73892571cb2d8e0417e2fe90641cc WatchSource:0}: Error finding container 40201ee45003fa97411c4499ca25c3ad46d73892571cb2d8e0417e2fe90641cc: Status 404 returned error can't find the container with id 40201ee45003fa97411c4499ca25c3ad46d73892571cb2d8e0417e2fe90641cc
Apr 23 16:46:53.089663 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:53.089634 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-6cf46f7d78-p6tn5"]
Apr 23 16:46:53.090926 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:46:53.090876 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddfe094ff_b04d_4c9e_9dcc_cfe7279e071f.slice/crio-1f1d6316202e29db5ce7654b8dca6a134d981da15a99f04865576d942df06449 WatchSource:0}: Error finding container 1f1d6316202e29db5ce7654b8dca6a134d981da15a99f04865576d942df06449: Status 404 returned error can't find the container with id 1f1d6316202e29db5ce7654b8dca6a134d981da15a99f04865576d942df06449
Apr 23 16:46:53.119188 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:53.119159 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-rkbz5"]
Apr 23 16:46:53.120940 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:46:53.120913 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d6b432c_ebcd_4f32_8197_5a410b0ef10c.slice/crio-33e908ee20893ca1bdb62519ae97023ab8efa20069b54ec9b20896c2cb8691c4 WatchSource:0}: Error finding container 33e908ee20893ca1bdb62519ae97023ab8efa20069b54ec9b20896c2cb8691c4: Status 404 returned error can't find the container with id 33e908ee20893ca1bdb62519ae97023ab8efa20069b54ec9b20896c2cb8691c4
Apr 23 16:46:53.961147 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:53.960266 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-rkbz5" event={"ID":"1d6b432c-ebcd-4f32-8197-5a410b0ef10c","Type":"ContainerStarted","Data":"33e908ee20893ca1bdb62519ae97023ab8efa20069b54ec9b20896c2cb8691c4"}
Apr 23 16:46:53.962733 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:53.962699 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6cf46f7d78-p6tn5" event={"ID":"dfe094ff-b04d-4c9e-9dcc-cfe7279e071f","Type":"ContainerStarted","Data":"1f1d6316202e29db5ce7654b8dca6a134d981da15a99f04865576d942df06449"}
Apr 23 16:46:53.965232 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:53.965203 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-5b898d7b9d-jdd7t" event={"ID":"eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c","Type":"ContainerStarted","Data":"40201ee45003fa97411c4499ca25c3ad46d73892571cb2d8e0417e2fe90641cc"}
Apr 23 16:46:57.988328 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:57.988197 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6cf46f7d78-p6tn5" event={"ID":"dfe094ff-b04d-4c9e-9dcc-cfe7279e071f","Type":"ContainerStarted","Data":"ddba0076fc1e9e1ed3448153c6f113004f17aeb372e3b24c61e0893bc47b25fe"}
Apr 23 16:46:57.988781 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:57.988333 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-6cf46f7d78-p6tn5"
Apr 23 16:46:57.989655 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:57.989634 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-5b898d7b9d-jdd7t" event={"ID":"eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c","Type":"ContainerStarted","Data":"304b6dc2d39fb02fcf20dd9643c7ea067bcc5ea1f9fedde6ff97a3143bfc0421"}
Apr 23 16:46:57.989768 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:57.989680 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-5b898d7b9d-jdd7t"
Apr 23 16:46:57.990955 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:57.990930 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-rkbz5" event={"ID":"1d6b432c-ebcd-4f32-8197-5a410b0ef10c","Type":"ContainerStarted","Data":"b869cb783aee881bf881a2d6145bd7064030b89be6740364f9a757f8d47af812"}
Apr 23 16:46:57.991096 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:57.991081 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-rkbz5"
Apr 23 16:46:58.006540 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:58.006497 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-6cf46f7d78-p6tn5" podStartSLOduration=1.403975407 podStartE2EDuration="6.006484429s" podCreationTimestamp="2026-04-23 16:46:52 +0000 UTC" firstStartedPulling="2026-04-23 16:46:53.092733237 +0000 UTC m=+699.153807469" lastFinishedPulling="2026-04-23 16:46:57.695242258 +0000 UTC m=+703.756316491" observedRunningTime="2026-04-23 16:46:58.003870649 +0000 UTC m=+704.064944902" watchObservedRunningTime="2026-04-23 16:46:58.006484429 +0000 UTC m=+704.067558683"
Apr 23 16:46:58.030880 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:58.030796 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-5b898d7b9d-jdd7t" podStartSLOduration=1.486756177 podStartE2EDuration="6.030783062s" podCreationTimestamp="2026-04-23 16:46:52 +0000 UTC" firstStartedPulling="2026-04-23 16:46:53.071499015 +0000 UTC m=+699.132573247" lastFinishedPulling="2026-04-23 16:46:57.615525901 +0000 UTC m=+703.676600132" observedRunningTime="2026-04-23 16:46:58.027613569 +0000 UTC m=+704.088687836" watchObservedRunningTime="2026-04-23 16:46:58.030783062 +0000 UTC m=+704.091857370"
Apr 23 16:46:58.064210 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:46:58.064064 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-rkbz5" podStartSLOduration=1.430916932 podStartE2EDuration="6.064052072s" podCreationTimestamp="2026-04-23 16:46:52 +0000 UTC" firstStartedPulling="2026-04-23 16:46:53.122232263 +0000 UTC m=+699.183306495" lastFinishedPulling="2026-04-23 16:46:57.755367399 +0000 UTC m=+703.816441635" observedRunningTime="2026-04-23 16:46:58.063862789 +0000 UTC m=+704.124937055" watchObservedRunningTime="2026-04-23 16:46:58.064052072 +0000 UTC m=+704.125126328"
Apr 23 16:47:03.997468 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:03.997436 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-rkbz5"
Apr 23 16:47:28.997418 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:28.997384 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-6cf46f7d78-p6tn5"
Apr 23 16:47:29.000348 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:29.000330 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-5b898d7b9d-jdd7t"
Apr 23 16:47:30.490187 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:30.490153 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-jdd7t"]
Apr 23 16:47:30.490662 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:30.490396 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-5b898d7b9d-jdd7t" podUID="eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c" containerName="manager" containerID="cri-o://304b6dc2d39fb02fcf20dd9643c7ea067bcc5ea1f9fedde6ff97a3143bfc0421" gracePeriod=10
Apr 23 16:47:30.515255 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:30.515224 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-8z97z"]
Apr 23 16:47:30.602776 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:30.602747 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-5b898d7b9d-8z97z"
Apr 23 16:47:30.608145 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:30.608122 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-8z97z"]
Apr 23 16:47:30.692842 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:30.692808 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b576b0bb-3d65-4229-8396-0f196cdbd516-cert\") pod \"kserve-controller-manager-5b898d7b9d-8z97z\" (UID: \"b576b0bb-3d65-4229-8396-0f196cdbd516\") " pod="kserve/kserve-controller-manager-5b898d7b9d-8z97z"
Apr 23 16:47:30.692984 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:30.692870 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhjxg\" (UniqueName: \"kubernetes.io/projected/b576b0bb-3d65-4229-8396-0f196cdbd516-kube-api-access-mhjxg\") pod \"kserve-controller-manager-5b898d7b9d-8z97z\" (UID: \"b576b0bb-3d65-4229-8396-0f196cdbd516\") " pod="kserve/kserve-controller-manager-5b898d7b9d-8z97z"
Apr 23 16:47:30.748977 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:30.748919 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-5b898d7b9d-jdd7t"
Apr 23 16:47:30.793880 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:30.793844 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhjxg\" (UniqueName: \"kubernetes.io/projected/b576b0bb-3d65-4229-8396-0f196cdbd516-kube-api-access-mhjxg\") pod \"kserve-controller-manager-5b898d7b9d-8z97z\" (UID: \"b576b0bb-3d65-4229-8396-0f196cdbd516\") " pod="kserve/kserve-controller-manager-5b898d7b9d-8z97z"
Apr 23 16:47:30.794056 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:30.793931 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b576b0bb-3d65-4229-8396-0f196cdbd516-cert\") pod \"kserve-controller-manager-5b898d7b9d-8z97z\" (UID: \"b576b0bb-3d65-4229-8396-0f196cdbd516\") " pod="kserve/kserve-controller-manager-5b898d7b9d-8z97z"
Apr 23 16:47:30.796399 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:30.796374 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b576b0bb-3d65-4229-8396-0f196cdbd516-cert\") pod \"kserve-controller-manager-5b898d7b9d-8z97z\" (UID: \"b576b0bb-3d65-4229-8396-0f196cdbd516\") " pod="kserve/kserve-controller-manager-5b898d7b9d-8z97z"
Apr 23 16:47:30.802329 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:30.802306 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhjxg\" (UniqueName: \"kubernetes.io/projected/b576b0bb-3d65-4229-8396-0f196cdbd516-kube-api-access-mhjxg\") pod \"kserve-controller-manager-5b898d7b9d-8z97z\" (UID: \"b576b0bb-3d65-4229-8396-0f196cdbd516\") " pod="kserve/kserve-controller-manager-5b898d7b9d-8z97z"
Apr 23 16:47:30.894881 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:30.894843 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c-cert\") pod \"eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c\" (UID: \"eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c\") "
Apr 23 16:47:30.895073 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:30.894918 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zj29j\" (UniqueName: \"kubernetes.io/projected/eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c-kube-api-access-zj29j\") pod \"eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c\" (UID: \"eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c\") "
Apr 23 16:47:30.897071 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:30.897039 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c-cert" (OuterVolumeSpecName: "cert") pod "eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c" (UID: "eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:47:30.897071 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:30.897039 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c-kube-api-access-zj29j" (OuterVolumeSpecName: "kube-api-access-zj29j") pod "eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c" (UID: "eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c"). InnerVolumeSpecName "kube-api-access-zj29j". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:47:30.962781 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:30.962740 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-5b898d7b9d-8z97z"
Apr 23 16:47:30.995896 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:30.995865 2580 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c-cert\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:47:30.995896 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:30.995895 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zj29j\" (UniqueName: \"kubernetes.io/projected/eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c-kube-api-access-zj29j\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:47:31.085925 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:31.085896 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-8z97z"]
Apr 23 16:47:31.087667 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:47:31.087637 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb576b0bb_3d65_4229_8396_0f196cdbd516.slice/crio-81877c8247752af3f7409b9c65fa700b4c89481a895a3842f836826e66578d8b WatchSource:0}: Error finding container 81877c8247752af3f7409b9c65fa700b4c89481a895a3842f836826e66578d8b: Status 404 returned error can't find the container with id 81877c8247752af3f7409b9c65fa700b4c89481a895a3842f836826e66578d8b
Apr 23 16:47:31.124829 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:31.124793 2580 generic.go:358] "Generic (PLEG): container finished" podID="eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c" containerID="304b6dc2d39fb02fcf20dd9643c7ea067bcc5ea1f9fedde6ff97a3143bfc0421" exitCode=0
Apr 23 16:47:31.124988 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:31.124827 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-5b898d7b9d-jdd7t" event={"ID":"eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c","Type":"ContainerDied","Data":"304b6dc2d39fb02fcf20dd9643c7ea067bcc5ea1f9fedde6ff97a3143bfc0421"}
Apr 23 16:47:31.124988 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:31.124856 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-5b898d7b9d-jdd7t"
Apr 23 16:47:31.124988 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:31.124870 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-5b898d7b9d-jdd7t" event={"ID":"eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c","Type":"ContainerDied","Data":"40201ee45003fa97411c4499ca25c3ad46d73892571cb2d8e0417e2fe90641cc"}
Apr 23 16:47:31.124988 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:31.124887 2580 scope.go:117] "RemoveContainer" containerID="304b6dc2d39fb02fcf20dd9643c7ea067bcc5ea1f9fedde6ff97a3143bfc0421"
Apr 23 16:47:31.126115 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:31.126090 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-5b898d7b9d-8z97z" event={"ID":"b576b0bb-3d65-4229-8396-0f196cdbd516","Type":"ContainerStarted","Data":"81877c8247752af3f7409b9c65fa700b4c89481a895a3842f836826e66578d8b"}
Apr 23 16:47:31.134266 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:31.134250 2580 scope.go:117] "RemoveContainer" containerID="304b6dc2d39fb02fcf20dd9643c7ea067bcc5ea1f9fedde6ff97a3143bfc0421"
Apr 23 16:47:31.134525 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:47:31.134508 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"304b6dc2d39fb02fcf20dd9643c7ea067bcc5ea1f9fedde6ff97a3143bfc0421\": container with ID starting with 304b6dc2d39fb02fcf20dd9643c7ea067bcc5ea1f9fedde6ff97a3143bfc0421 not found: ID does not exist" containerID="304b6dc2d39fb02fcf20dd9643c7ea067bcc5ea1f9fedde6ff97a3143bfc0421"
Apr 23 16:47:31.134581 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:31.134532 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"304b6dc2d39fb02fcf20dd9643c7ea067bcc5ea1f9fedde6ff97a3143bfc0421"} err="failed to get container status \"304b6dc2d39fb02fcf20dd9643c7ea067bcc5ea1f9fedde6ff97a3143bfc0421\": rpc error: code = NotFound desc = could not find container \"304b6dc2d39fb02fcf20dd9643c7ea067bcc5ea1f9fedde6ff97a3143bfc0421\": container with ID starting with 304b6dc2d39fb02fcf20dd9643c7ea067bcc5ea1f9fedde6ff97a3143bfc0421 not found: ID does not exist"
Apr 23 16:47:31.146757 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:31.146727 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-jdd7t"]
Apr 23 16:47:31.149094 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:31.149074 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-jdd7t"]
Apr 23 16:47:32.131935 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:32.131893 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-5b898d7b9d-8z97z" event={"ID":"b576b0bb-3d65-4229-8396-0f196cdbd516","Type":"ContainerStarted","Data":"c8772f0e22828d8ad02a467d59b848274163207b65da3316d36a394d0da439f7"}
Apr 23 16:47:32.132339 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:32.132038 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-5b898d7b9d-8z97z"
Apr 23 16:47:32.149382 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:32.149332 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-5b898d7b9d-8z97z" podStartSLOduration=1.572860659 podStartE2EDuration="2.149317976s" podCreationTimestamp="2026-04-23 16:47:30 +0000 UTC" firstStartedPulling="2026-04-23 16:47:31.088923433 +0000 UTC m=+737.149997668" lastFinishedPulling="2026-04-23 16:47:31.665380738 +0000 UTC m=+737.726454985" observedRunningTime="2026-04-23 16:47:32.146874746 +0000 UTC m=+738.207948997" watchObservedRunningTime="2026-04-23 16:47:32.149317976 +0000 UTC m=+738.210392275"
Apr 23 16:47:32.537877 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:47:32.537842 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c" path="/var/lib/kubelet/pods/eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c/volumes"
Apr 23 16:48:03.140418 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:03.140384 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-5b898d7b9d-8z97z"
Apr 23 16:48:04.096441 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:04.096402 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-xktt4"]
Apr 23 16:48:04.096876 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:04.096857 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c" containerName="manager"
Apr 23 16:48:04.096982 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:04.096878 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c" containerName="manager"
Apr 23 16:48:04.096982 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:04.096961 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="eeb6dc1a-18b1-4b35-a6e0-c3807f3c301c" containerName="manager"
Apr 23 16:48:04.099070 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:04.099047 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-xktt4"
Apr 23 16:48:04.101780 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:04.101756 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-wf7wz\""
Apr 23 16:48:04.101894 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:04.101761 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 23 16:48:04.109540 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:04.109516 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-xktt4"]
Apr 23 16:48:04.115694 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:04.115663 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-5kjzr"]
Apr 23 16:48:04.118362 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:04.118342 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-5kjzr"
Apr 23 16:48:04.120907 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:04.120889 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 23 16:48:04.121365 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:04.121345 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-mj8rx\""
Apr 23 16:48:04.128674 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:04.128651 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-5kjzr"]
Apr 23 16:48:04.174578 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:04.174546 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btffm\" (UniqueName: \"kubernetes.io/projected/bf739949-4ce7-4c9b-a0c6-5a25705080da-kube-api-access-btffm\") pod \"odh-model-controller-696fc77849-5kjzr\" (UID: \"bf739949-4ce7-4c9b-a0c6-5a25705080da\") " pod="kserve/odh-model-controller-696fc77849-5kjzr"
Apr 23 16:48:04.174957 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:04.174594 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/897c34ed-98e1-4d45-85a6-ca0ce1c2c2bf-tls-certs\") pod \"model-serving-api-86f7b4b499-xktt4\" (UID: \"897c34ed-98e1-4d45-85a6-ca0ce1c2c2bf\") " pod="kserve/model-serving-api-86f7b4b499-xktt4"
Apr 23 16:48:04.174957 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:04.174701 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf739949-4ce7-4c9b-a0c6-5a25705080da-cert\") pod \"odh-model-controller-696fc77849-5kjzr\" (UID: \"bf739949-4ce7-4c9b-a0c6-5a25705080da\") " pod="kserve/odh-model-controller-696fc77849-5kjzr"
Apr 23 16:48:04.174957 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:04.174741 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6grl\" (UniqueName: \"kubernetes.io/projected/897c34ed-98e1-4d45-85a6-ca0ce1c2c2bf-kube-api-access-z6grl\") pod \"model-serving-api-86f7b4b499-xktt4\" (UID: \"897c34ed-98e1-4d45-85a6-ca0ce1c2c2bf\") " pod="kserve/model-serving-api-86f7b4b499-xktt4"
Apr 23 16:48:04.275365 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:04.275323 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-btffm\" (UniqueName: \"kubernetes.io/projected/bf739949-4ce7-4c9b-a0c6-5a25705080da-kube-api-access-btffm\") pod \"odh-model-controller-696fc77849-5kjzr\" (UID: \"bf739949-4ce7-4c9b-a0c6-5a25705080da\") " pod="kserve/odh-model-controller-696fc77849-5kjzr"
Apr 23 16:48:04.275549 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:04.275384 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/897c34ed-98e1-4d45-85a6-ca0ce1c2c2bf-tls-certs\") pod \"model-serving-api-86f7b4b499-xktt4\" (UID: \"897c34ed-98e1-4d45-85a6-ca0ce1c2c2bf\") " pod="kserve/model-serving-api-86f7b4b499-xktt4"
Apr 23 16:48:04.275549 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:04.275463 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf739949-4ce7-4c9b-a0c6-5a25705080da-cert\") pod \"odh-model-controller-696fc77849-5kjzr\" (UID: \"bf739949-4ce7-4c9b-a0c6-5a25705080da\") " pod="kserve/odh-model-controller-696fc77849-5kjzr"
Apr 23 16:48:04.275549 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:04.275499 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6grl\" (UniqueName: \"kubernetes.io/projected/897c34ed-98e1-4d45-85a6-ca0ce1c2c2bf-kube-api-access-z6grl\") pod \"model-serving-api-86f7b4b499-xktt4\" (UID: \"897c34ed-98e1-4d45-85a6-ca0ce1c2c2bf\") " pod="kserve/model-serving-api-86f7b4b499-xktt4"
Apr 23 16:48:04.275760 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:48:04.275609 2580 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 23 16:48:04.275760 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:48:04.275690 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf739949-4ce7-4c9b-a0c6-5a25705080da-cert podName:bf739949-4ce7-4c9b-a0c6-5a25705080da nodeName:}" failed. No retries permitted until 2026-04-23 16:48:04.775669694 +0000 UTC m=+770.836743934 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf739949-4ce7-4c9b-a0c6-5a25705080da-cert") pod "odh-model-controller-696fc77849-5kjzr" (UID: "bf739949-4ce7-4c9b-a0c6-5a25705080da") : secret "odh-model-controller-webhook-cert" not found
Apr 23 16:48:04.278082 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:04.278058 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/897c34ed-98e1-4d45-85a6-ca0ce1c2c2bf-tls-certs\") pod \"model-serving-api-86f7b4b499-xktt4\" (UID: \"897c34ed-98e1-4d45-85a6-ca0ce1c2c2bf\") " pod="kserve/model-serving-api-86f7b4b499-xktt4"
Apr 23 16:48:04.288045 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:04.288013 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6grl\" (UniqueName: \"kubernetes.io/projected/897c34ed-98e1-4d45-85a6-ca0ce1c2c2bf-kube-api-access-z6grl\") pod \"model-serving-api-86f7b4b499-xktt4\" (UID: \"897c34ed-98e1-4d45-85a6-ca0ce1c2c2bf\") " pod="kserve/model-serving-api-86f7b4b499-xktt4"
Apr 23 16:48:04.288176 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:04.288160 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-btffm\" (UniqueName: \"kubernetes.io/projected/bf739949-4ce7-4c9b-a0c6-5a25705080da-kube-api-access-btffm\") pod \"odh-model-controller-696fc77849-5kjzr\" (UID: \"bf739949-4ce7-4c9b-a0c6-5a25705080da\") " pod="kserve/odh-model-controller-696fc77849-5kjzr"
Apr 23 16:48:04.411032 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:04.410949 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-xktt4"
Apr 23 16:48:04.538780 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:48:04.538751 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod897c34ed_98e1_4d45_85a6_ca0ce1c2c2bf.slice/crio-7f1e2ade3ff06fa0f1be83b38f996628d941429c266901ce1f75533e8a497905 WatchSource:0}: Error finding container 7f1e2ade3ff06fa0f1be83b38f996628d941429c266901ce1f75533e8a497905: Status 404 returned error can't find the container with id 7f1e2ade3ff06fa0f1be83b38f996628d941429c266901ce1f75533e8a497905
Apr 23 16:48:04.539563 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:04.539537 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-xktt4"]
Apr 23 16:48:04.780162 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:04.780123 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf739949-4ce7-4c9b-a0c6-5a25705080da-cert\") pod \"odh-model-controller-696fc77849-5kjzr\" (UID: \"bf739949-4ce7-4c9b-a0c6-5a25705080da\") " pod="kserve/odh-model-controller-696fc77849-5kjzr"
Apr 23 16:48:04.782544 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:04.782522 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf739949-4ce7-4c9b-a0c6-5a25705080da-cert\") pod \"odh-model-controller-696fc77849-5kjzr\" (UID: \"bf739949-4ce7-4c9b-a0c6-5a25705080da\") " pod="kserve/odh-model-controller-696fc77849-5kjzr"
Apr 23 16:48:05.031528 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:05.031430 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-5kjzr"
Apr 23 16:48:05.155912 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:05.155886 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-5kjzr"]
Apr 23 16:48:05.157498 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:48:05.157471 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf739949_4ce7_4c9b_a0c6_5a25705080da.slice/crio-61b74ba28faac88c71366803a9d0a7f6eb311677f4e7dbbd769b123d4ba0a670 WatchSource:0}: Error finding container 61b74ba28faac88c71366803a9d0a7f6eb311677f4e7dbbd769b123d4ba0a670: Status 404 returned error can't find the container with id 61b74ba28faac88c71366803a9d0a7f6eb311677f4e7dbbd769b123d4ba0a670
Apr 23 16:48:05.261012 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:05.260927 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-5kjzr" event={"ID":"bf739949-4ce7-4c9b-a0c6-5a25705080da","Type":"ContainerStarted","Data":"61b74ba28faac88c71366803a9d0a7f6eb311677f4e7dbbd769b123d4ba0a670"}
Apr 23 16:48:05.262328 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:05.262275 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-xktt4" event={"ID":"897c34ed-98e1-4d45-85a6-ca0ce1c2c2bf","Type":"ContainerStarted","Data":"7f1e2ade3ff06fa0f1be83b38f996628d941429c266901ce1f75533e8a497905"}
Apr 23 16:48:07.273902 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:07.273847 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-xktt4" event={"ID":"897c34ed-98e1-4d45-85a6-ca0ce1c2c2bf","Type":"ContainerStarted","Data":"e58ad3fc9dc46052a373075e737470b9ce35506fcc3973f506551fe4ae398c35"}
Apr 23 16:48:07.274374 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:07.273928 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-xktt4"
Apr 23 16:48:07.293354 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:07.293283 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-xktt4" podStartSLOduration=1.574085618 podStartE2EDuration="3.293267981s" podCreationTimestamp="2026-04-23 16:48:04 +0000 UTC" firstStartedPulling="2026-04-23 16:48:04.540425695 +0000 UTC m=+770.601499927" lastFinishedPulling="2026-04-23 16:48:06.259608057 +0000 UTC m=+772.320682290" observedRunningTime="2026-04-23 16:48:07.291358972 +0000 UTC m=+773.352433226" watchObservedRunningTime="2026-04-23 16:48:07.293267981 +0000 UTC m=+773.354342235"
Apr 23 16:48:08.279127 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:08.279087 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-5kjzr" event={"ID":"bf739949-4ce7-4c9b-a0c6-5a25705080da","Type":"ContainerStarted","Data":"7102847f1e69e50e450b9a51fac01d8c038b8e4048d247d9f5992b9cd361abc8"}
Apr 23 16:48:08.279652 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:08.279221 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-5kjzr"
Apr 23 16:48:08.296803 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:08.296741 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-5kjzr" podStartSLOduration=1.695135695 podStartE2EDuration="4.296725036s" podCreationTimestamp="2026-04-23 16:48:04 +0000 UTC" firstStartedPulling="2026-04-23 16:48:05.158887228 +0000 UTC m=+771.219961460" lastFinishedPulling="2026-04-23 16:48:07.760476563 +0000 UTC m=+773.821550801" observedRunningTime="2026-04-23 16:48:08.29508065 +0000 UTC m=+774.356154906" watchObservedRunningTime="2026-04-23 16:48:08.296725036 +0000 UTC m=+774.357799335"
Apr 23 16:48:18.283769 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:18.283736 2580 kubelet.go:2658]
"SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-xktt4" Apr 23 16:48:19.286437 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:19.286406 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-5kjzr" Apr 23 16:48:20.139425 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:20.139392 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-rzp4s"] Apr 23 16:48:20.142582 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:20.142566 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-rzp4s" Apr 23 16:48:20.149711 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:20.149630 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-rzp4s"] Apr 23 16:48:20.219479 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:20.219447 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l64s\" (UniqueName: \"kubernetes.io/projected/fef1709f-12fd-47ac-a6f6-7bf5822b7678-kube-api-access-9l64s\") pod \"s3-init-rzp4s\" (UID: \"fef1709f-12fd-47ac-a6f6-7bf5822b7678\") " pod="kserve/s3-init-rzp4s" Apr 23 16:48:20.320377 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:20.320332 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9l64s\" (UniqueName: \"kubernetes.io/projected/fef1709f-12fd-47ac-a6f6-7bf5822b7678-kube-api-access-9l64s\") pod \"s3-init-rzp4s\" (UID: \"fef1709f-12fd-47ac-a6f6-7bf5822b7678\") " pod="kserve/s3-init-rzp4s" Apr 23 16:48:20.329613 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:20.329580 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l64s\" (UniqueName: \"kubernetes.io/projected/fef1709f-12fd-47ac-a6f6-7bf5822b7678-kube-api-access-9l64s\") pod \"s3-init-rzp4s\" (UID: \"fef1709f-12fd-47ac-a6f6-7bf5822b7678\") " pod="kserve/s3-init-rzp4s" 
Apr 23 16:48:20.452270 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:20.452181 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-rzp4s" Apr 23 16:48:20.784279 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:20.784253 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-rzp4s"] Apr 23 16:48:20.786234 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:48:20.786206 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfef1709f_12fd_47ac_a6f6_7bf5822b7678.slice/crio-def686d8f587892898c7d7d53c6f66878ad95e4dc13e14ac4af9491a8274eba9 WatchSource:0}: Error finding container def686d8f587892898c7d7d53c6f66878ad95e4dc13e14ac4af9491a8274eba9: Status 404 returned error can't find the container with id def686d8f587892898c7d7d53c6f66878ad95e4dc13e14ac4af9491a8274eba9 Apr 23 16:48:21.344309 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:21.344252 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-rzp4s" event={"ID":"fef1709f-12fd-47ac-a6f6-7bf5822b7678","Type":"ContainerStarted","Data":"def686d8f587892898c7d7d53c6f66878ad95e4dc13e14ac4af9491a8274eba9"} Apr 23 16:48:26.367989 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:26.367950 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-rzp4s" event={"ID":"fef1709f-12fd-47ac-a6f6-7bf5822b7678","Type":"ContainerStarted","Data":"e1f6546bc6842c91e98bf0218ea0a88c9700f5eb81bd7b9e7d9b9083dffe1f23"} Apr 23 16:48:29.380971 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:29.380938 2580 generic.go:358] "Generic (PLEG): container finished" podID="fef1709f-12fd-47ac-a6f6-7bf5822b7678" containerID="e1f6546bc6842c91e98bf0218ea0a88c9700f5eb81bd7b9e7d9b9083dffe1f23" exitCode=0 Apr 23 16:48:29.381380 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:29.381016 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-rzp4s" 
event={"ID":"fef1709f-12fd-47ac-a6f6-7bf5822b7678","Type":"ContainerDied","Data":"e1f6546bc6842c91e98bf0218ea0a88c9700f5eb81bd7b9e7d9b9083dffe1f23"} Apr 23 16:48:30.516314 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:30.516271 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-rzp4s" Apr 23 16:48:30.612424 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:30.612386 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l64s\" (UniqueName: \"kubernetes.io/projected/fef1709f-12fd-47ac-a6f6-7bf5822b7678-kube-api-access-9l64s\") pod \"fef1709f-12fd-47ac-a6f6-7bf5822b7678\" (UID: \"fef1709f-12fd-47ac-a6f6-7bf5822b7678\") " Apr 23 16:48:30.614525 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:30.614486 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fef1709f-12fd-47ac-a6f6-7bf5822b7678-kube-api-access-9l64s" (OuterVolumeSpecName: "kube-api-access-9l64s") pod "fef1709f-12fd-47ac-a6f6-7bf5822b7678" (UID: "fef1709f-12fd-47ac-a6f6-7bf5822b7678"). InnerVolumeSpecName "kube-api-access-9l64s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:48:30.713777 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:30.713672 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9l64s\" (UniqueName: \"kubernetes.io/projected/fef1709f-12fd-47ac-a6f6-7bf5822b7678-kube-api-access-9l64s\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:48:31.390352 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:31.390316 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-rzp4s" event={"ID":"fef1709f-12fd-47ac-a6f6-7bf5822b7678","Type":"ContainerDied","Data":"def686d8f587892898c7d7d53c6f66878ad95e4dc13e14ac4af9491a8274eba9"} Apr 23 16:48:31.390352 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:31.390335 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-rzp4s" Apr 23 16:48:31.390352 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:31.390353 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="def686d8f587892898c7d7d53c6f66878ad95e4dc13e14ac4af9491a8274eba9" Apr 23 16:48:59.249319 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:59.249244 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw"] Apr 23 16:48:59.250210 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:59.249883 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fef1709f-12fd-47ac-a6f6-7bf5822b7678" containerName="s3-init" Apr 23 16:48:59.250210 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:59.249910 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="fef1709f-12fd-47ac-a6f6-7bf5822b7678" containerName="s3-init" Apr 23 16:48:59.250210 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:59.250023 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="fef1709f-12fd-47ac-a6f6-7bf5822b7678" containerName="s3-init" Apr 23 16:48:59.364600 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:59.364564 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw"] Apr 23 16:48:59.364756 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:59.364720 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw" Apr 23 16:48:59.367624 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:59.367595 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 16:48:59.367753 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:59.367605 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 16:48:59.368753 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:59.368728 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-b4jmv\"" Apr 23 16:48:59.368889 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:59.368769 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 23 16:48:59.445625 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:59.445587 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcqgw\" (UniqueName: \"kubernetes.io/projected/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-kube-api-access-vcqgw\") pod \"scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw\" (UID: \"1230f84c-3378-48ee-92bd-8cfe6b10c7fd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw" Apr 23 16:48:59.445839 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:59.445661 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw\" (UID: \"1230f84c-3378-48ee-92bd-8cfe6b10c7fd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw" Apr 23 16:48:59.445839 ip-10-0-128-198 
kubenswrapper[2580]: I0423 16:48:59.445713 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-model-cache\") pod \"scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw\" (UID: \"1230f84c-3378-48ee-92bd-8cfe6b10c7fd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw" Apr 23 16:48:59.445839 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:59.445751 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw\" (UID: \"1230f84c-3378-48ee-92bd-8cfe6b10c7fd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw" Apr 23 16:48:59.445839 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:59.445782 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-home\") pod \"scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw\" (UID: \"1230f84c-3378-48ee-92bd-8cfe6b10c7fd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw" Apr 23 16:48:59.446009 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:59.445856 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-dshm\") pod \"scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw\" (UID: \"1230f84c-3378-48ee-92bd-8cfe6b10c7fd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw" Apr 23 16:48:59.546406 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:59.546314 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw\" (UID: \"1230f84c-3378-48ee-92bd-8cfe6b10c7fd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw" Apr 23 16:48:59.546406 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:59.546363 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-model-cache\") pod \"scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw\" (UID: \"1230f84c-3378-48ee-92bd-8cfe6b10c7fd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw" Apr 23 16:48:59.546406 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:59.546390 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw\" (UID: \"1230f84c-3378-48ee-92bd-8cfe6b10c7fd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw" Apr 23 16:48:59.546680 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:59.546417 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-home\") pod \"scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw\" (UID: \"1230f84c-3378-48ee-92bd-8cfe6b10c7fd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw" Apr 23 16:48:59.546680 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:59.546553 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-dshm\") pod 
\"scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw\" (UID: \"1230f84c-3378-48ee-92bd-8cfe6b10c7fd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw" Apr 23 16:48:59.546680 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:59.546623 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vcqgw\" (UniqueName: \"kubernetes.io/projected/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-kube-api-access-vcqgw\") pod \"scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw\" (UID: \"1230f84c-3378-48ee-92bd-8cfe6b10c7fd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw" Apr 23 16:48:59.546840 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:59.546787 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-model-cache\") pod \"scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw\" (UID: \"1230f84c-3378-48ee-92bd-8cfe6b10c7fd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw" Apr 23 16:48:59.546840 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:59.546821 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-home\") pod \"scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw\" (UID: \"1230f84c-3378-48ee-92bd-8cfe6b10c7fd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw" Apr 23 16:48:59.546950 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:59.546846 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw\" (UID: \"1230f84c-3378-48ee-92bd-8cfe6b10c7fd\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw" Apr 23 16:48:59.548683 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:59.548659 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-dshm\") pod \"scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw\" (UID: \"1230f84c-3378-48ee-92bd-8cfe6b10c7fd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw" Apr 23 16:48:59.548815 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:59.548794 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw\" (UID: \"1230f84c-3378-48ee-92bd-8cfe6b10c7fd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw" Apr 23 16:48:59.555056 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:59.555034 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcqgw\" (UniqueName: \"kubernetes.io/projected/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-kube-api-access-vcqgw\") pod \"scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw\" (UID: \"1230f84c-3378-48ee-92bd-8cfe6b10c7fd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw" Apr 23 16:48:59.675370 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:59.675331 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw" Apr 23 16:48:59.809453 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:48:59.809426 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw"] Apr 23 16:48:59.811734 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:48:59.811691 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1230f84c_3378_48ee_92bd_8cfe6b10c7fd.slice/crio-acd1f5e24ed4c0dcb1460c64cf319e2fa79a90eb6f502b8a8482a6e4a63e3591 WatchSource:0}: Error finding container acd1f5e24ed4c0dcb1460c64cf319e2fa79a90eb6f502b8a8482a6e4a63e3591: Status 404 returned error can't find the container with id acd1f5e24ed4c0dcb1460c64cf319e2fa79a90eb6f502b8a8482a6e4a63e3591 Apr 23 16:49:00.505935 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:00.505896 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw" event={"ID":"1230f84c-3378-48ee-92bd-8cfe6b10c7fd","Type":"ContainerStarted","Data":"acd1f5e24ed4c0dcb1460c64cf319e2fa79a90eb6f502b8a8482a6e4a63e3591"} Apr 23 16:49:03.520556 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:03.520459 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw" event={"ID":"1230f84c-3378-48ee-92bd-8cfe6b10c7fd","Type":"ContainerStarted","Data":"5421584bd7f8055f9edea965c6e18c7736a0be05730f58e00f2c63dc1eee2083"} Apr 23 16:49:07.542449 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:07.542360 2580 generic.go:358] "Generic (PLEG): container finished" podID="1230f84c-3378-48ee-92bd-8cfe6b10c7fd" containerID="5421584bd7f8055f9edea965c6e18c7736a0be05730f58e00f2c63dc1eee2083" exitCode=0 Apr 23 16:49:07.542449 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:07.542406 2580 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw" event={"ID":"1230f84c-3378-48ee-92bd-8cfe6b10c7fd","Type":"ContainerDied","Data":"5421584bd7f8055f9edea965c6e18c7736a0be05730f58e00f2c63dc1eee2083"} Apr 23 16:49:09.555059 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:09.555023 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw" event={"ID":"1230f84c-3378-48ee-92bd-8cfe6b10c7fd","Type":"ContainerStarted","Data":"153bf9c2285b57b419f7e0fd5fde7b3c263c2b7e7d95244dd9bfd2900adf20ab"} Apr 23 16:49:09.576805 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:09.576751 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw" podStartSLOduration=1.672878831 podStartE2EDuration="10.576736701s" podCreationTimestamp="2026-04-23 16:48:59 +0000 UTC" firstStartedPulling="2026-04-23 16:48:59.813468873 +0000 UTC m=+825.874543104" lastFinishedPulling="2026-04-23 16:49:08.717326721 +0000 UTC m=+834.778400974" observedRunningTime="2026-04-23 16:49:09.573772415 +0000 UTC m=+835.634846670" watchObservedRunningTime="2026-04-23 16:49:09.576736701 +0000 UTC m=+835.637810955" Apr 23 16:49:09.675634 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:09.675597 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw" Apr 23 16:49:09.675808 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:09.675727 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw" Apr 23 16:49:09.688356 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:09.688327 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw" 
Apr 23 16:49:10.570674 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:10.570643 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw" Apr 23 16:49:12.560232 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:12.560197 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk"] Apr 23 16:49:12.569669 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:12.569635 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk" Apr 23 16:49:12.573643 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:12.573606 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk"] Apr 23 16:49:12.573960 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:12.573942 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs\"" Apr 23 16:49:12.665845 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:12.665793 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a52bc069-6503-4fff-a388-aa87a1651480-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk\" (UID: \"a52bc069-6503-4fff-a388-aa87a1651480\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk" Apr 23 16:49:12.666053 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:12.665855 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a52bc069-6503-4fff-a388-aa87a1651480-home\") pod 
\"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk\" (UID: \"a52bc069-6503-4fff-a388-aa87a1651480\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk" Apr 23 16:49:12.666053 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:12.665914 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a52bc069-6503-4fff-a388-aa87a1651480-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk\" (UID: \"a52bc069-6503-4fff-a388-aa87a1651480\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk" Apr 23 16:49:12.666053 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:12.665976 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a52bc069-6503-4fff-a388-aa87a1651480-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk\" (UID: \"a52bc069-6503-4fff-a388-aa87a1651480\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk" Apr 23 16:49:12.666053 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:12.666004 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a52bc069-6503-4fff-a388-aa87a1651480-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk\" (UID: \"a52bc069-6503-4fff-a388-aa87a1651480\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk" Apr 23 16:49:12.666286 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:12.666083 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q7h9\" (UniqueName: 
\"kubernetes.io/projected/a52bc069-6503-4fff-a388-aa87a1651480-kube-api-access-7q7h9\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk\" (UID: \"a52bc069-6503-4fff-a388-aa87a1651480\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk"
Apr 23 16:49:12.766947 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:12.766913 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a52bc069-6503-4fff-a388-aa87a1651480-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk\" (UID: \"a52bc069-6503-4fff-a388-aa87a1651480\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk"
Apr 23 16:49:12.767111 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:12.766953 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a52bc069-6503-4fff-a388-aa87a1651480-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk\" (UID: \"a52bc069-6503-4fff-a388-aa87a1651480\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk"
Apr 23 16:49:12.767111 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:12.766989 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a52bc069-6503-4fff-a388-aa87a1651480-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk\" (UID: \"a52bc069-6503-4fff-a388-aa87a1651480\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk"
Apr 23 16:49:12.767111 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:12.767021 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a52bc069-6503-4fff-a388-aa87a1651480-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk\" (UID: \"a52bc069-6503-4fff-a388-aa87a1651480\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk"
Apr 23 16:49:12.767111 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:12.767041 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a52bc069-6503-4fff-a388-aa87a1651480-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk\" (UID: \"a52bc069-6503-4fff-a388-aa87a1651480\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk"
Apr 23 16:49:12.767111 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:12.767082 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7q7h9\" (UniqueName: \"kubernetes.io/projected/a52bc069-6503-4fff-a388-aa87a1651480-kube-api-access-7q7h9\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk\" (UID: \"a52bc069-6503-4fff-a388-aa87a1651480\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk"
Apr 23 16:49:12.767464 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:12.767431 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a52bc069-6503-4fff-a388-aa87a1651480-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk\" (UID: \"a52bc069-6503-4fff-a388-aa87a1651480\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk"
Apr 23 16:49:12.767574 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:12.767551 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a52bc069-6503-4fff-a388-aa87a1651480-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk\" (UID: \"a52bc069-6503-4fff-a388-aa87a1651480\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk"
Apr 23 16:49:12.767803 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:12.767772 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a52bc069-6503-4fff-a388-aa87a1651480-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk\" (UID: \"a52bc069-6503-4fff-a388-aa87a1651480\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk"
Apr 23 16:49:12.769234 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:12.769212 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a52bc069-6503-4fff-a388-aa87a1651480-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk\" (UID: \"a52bc069-6503-4fff-a388-aa87a1651480\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk"
Apr 23 16:49:12.769333 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:12.769220 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a52bc069-6503-4fff-a388-aa87a1651480-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk\" (UID: \"a52bc069-6503-4fff-a388-aa87a1651480\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk"
Apr 23 16:49:12.775068 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:12.775041 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q7h9\" (UniqueName: \"kubernetes.io/projected/a52bc069-6503-4fff-a388-aa87a1651480-kube-api-access-7q7h9\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk\" (UID: \"a52bc069-6503-4fff-a388-aa87a1651480\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk"
Apr 23 16:49:12.883283 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:12.883178 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk"
Apr 23 16:49:13.015258 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:13.015226 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk"]
Apr 23 16:49:13.016931 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:49:13.016902 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda52bc069_6503_4fff_a388_aa87a1651480.slice/crio-34234b7cb92ae250cb81ecd73b9e8461ad30c818e4d59097853ba7525f05d16c WatchSource:0}: Error finding container 34234b7cb92ae250cb81ecd73b9e8461ad30c818e4d59097853ba7525f05d16c: Status 404 returned error can't find the container with id 34234b7cb92ae250cb81ecd73b9e8461ad30c818e4d59097853ba7525f05d16c
Apr 23 16:49:13.573725 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:13.573684 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk" event={"ID":"a52bc069-6503-4fff-a388-aa87a1651480","Type":"ContainerStarted","Data":"ab19625ab0ec09790225461e8f8171f4d6f6ab258f6921732b5edb19c207e05b"}
Apr 23 16:49:13.573725 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:13.573723 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk" event={"ID":"a52bc069-6503-4fff-a388-aa87a1651480","Type":"ContainerStarted","Data":"34234b7cb92ae250cb81ecd73b9e8461ad30c818e4d59097853ba7525f05d16c"}
Apr 23 16:49:17.590734 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:17.590696 2580 generic.go:358] "Generic (PLEG): container finished" podID="a52bc069-6503-4fff-a388-aa87a1651480" containerID="ab19625ab0ec09790225461e8f8171f4d6f6ab258f6921732b5edb19c207e05b" exitCode=0
Apr 23 16:49:17.591167 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:17.590771 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk" event={"ID":"a52bc069-6503-4fff-a388-aa87a1651480","Type":"ContainerDied","Data":"ab19625ab0ec09790225461e8f8171f4d6f6ab258f6921732b5edb19c207e05b"}
Apr 23 16:49:18.596352 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:18.596317 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk" event={"ID":"a52bc069-6503-4fff-a388-aa87a1651480","Type":"ContainerStarted","Data":"8812787ac50914c12bb7a27a9df7b8a912acc27da8931ee3463b682e93b59f6d"}
Apr 23 16:49:18.623429 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:18.623381 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk" podStartSLOduration=6.623362611 podStartE2EDuration="6.623362611s" podCreationTimestamp="2026-04-23 16:49:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:49:18.623221017 +0000 UTC m=+844.684295282" watchObservedRunningTime="2026-04-23 16:49:18.623362611 +0000 UTC m=+844.684436864"
Apr 23 16:49:22.883931 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:22.883886 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk"
Apr 23 16:49:22.884402 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:22.883946 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk"
Apr 23 16:49:22.900042 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:22.900019 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk"
Apr 23 16:49:23.627960 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:23.627930 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk"
Apr 23 16:49:51.018665 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.018628 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw"]
Apr 23 16:49:51.019048 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.018992 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw" podUID="1230f84c-3378-48ee-92bd-8cfe6b10c7fd" containerName="main" containerID="cri-o://153bf9c2285b57b419f7e0fd5fde7b3c263c2b7e7d95244dd9bfd2900adf20ab" gracePeriod=30
Apr 23 16:49:51.275711 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.275656 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw"
Apr 23 16:49:51.303393 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.303362 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-kserve-provision-location\") pod \"1230f84c-3378-48ee-92bd-8cfe6b10c7fd\" (UID: \"1230f84c-3378-48ee-92bd-8cfe6b10c7fd\") "
Apr 23 16:49:51.303571 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.303424 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-model-cache\") pod \"1230f84c-3378-48ee-92bd-8cfe6b10c7fd\" (UID: \"1230f84c-3378-48ee-92bd-8cfe6b10c7fd\") "
Apr 23 16:49:51.303571 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.303450 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-dshm\") pod \"1230f84c-3378-48ee-92bd-8cfe6b10c7fd\" (UID: \"1230f84c-3378-48ee-92bd-8cfe6b10c7fd\") "
Apr 23 16:49:51.303571 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.303514 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-home\") pod \"1230f84c-3378-48ee-92bd-8cfe6b10c7fd\" (UID: \"1230f84c-3378-48ee-92bd-8cfe6b10c7fd\") "
Apr 23 16:49:51.303571 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.303547 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcqgw\" (UniqueName: \"kubernetes.io/projected/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-kube-api-access-vcqgw\") pod \"1230f84c-3378-48ee-92bd-8cfe6b10c7fd\" (UID: \"1230f84c-3378-48ee-92bd-8cfe6b10c7fd\") "
Apr 23 16:49:51.303797 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.303598 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-tls-certs\") pod \"1230f84c-3378-48ee-92bd-8cfe6b10c7fd\" (UID: \"1230f84c-3378-48ee-92bd-8cfe6b10c7fd\") "
Apr 23 16:49:51.303797 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.303692 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-model-cache" (OuterVolumeSpecName: "model-cache") pod "1230f84c-3378-48ee-92bd-8cfe6b10c7fd" (UID: "1230f84c-3378-48ee-92bd-8cfe6b10c7fd"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:49:51.303797 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.303710 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-home" (OuterVolumeSpecName: "home") pod "1230f84c-3378-48ee-92bd-8cfe6b10c7fd" (UID: "1230f84c-3378-48ee-92bd-8cfe6b10c7fd"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:49:51.303965 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.303929 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-model-cache\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:49:51.303965 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.303956 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-home\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:49:51.305907 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.305871 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "1230f84c-3378-48ee-92bd-8cfe6b10c7fd" (UID: "1230f84c-3378-48ee-92bd-8cfe6b10c7fd"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:49:51.305907 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.305887 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-kube-api-access-vcqgw" (OuterVolumeSpecName: "kube-api-access-vcqgw") pod "1230f84c-3378-48ee-92bd-8cfe6b10c7fd" (UID: "1230f84c-3378-48ee-92bd-8cfe6b10c7fd"). InnerVolumeSpecName "kube-api-access-vcqgw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:49:51.306095 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.305956 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-dshm" (OuterVolumeSpecName: "dshm") pod "1230f84c-3378-48ee-92bd-8cfe6b10c7fd" (UID: "1230f84c-3378-48ee-92bd-8cfe6b10c7fd"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:49:51.359675 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.359631 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1230f84c-3378-48ee-92bd-8cfe6b10c7fd" (UID: "1230f84c-3378-48ee-92bd-8cfe6b10c7fd"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:49:51.404918 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.404886 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-kserve-provision-location\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:49:51.404918 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.404917 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-dshm\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:49:51.405120 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.404927 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vcqgw\" (UniqueName: \"kubernetes.io/projected/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-kube-api-access-vcqgw\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:49:51.405120 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.404938 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1230f84c-3378-48ee-92bd-8cfe6b10c7fd-tls-certs\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:49:51.735077 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.735039 2580 generic.go:358] "Generic (PLEG): container finished" podID="1230f84c-3378-48ee-92bd-8cfe6b10c7fd" containerID="153bf9c2285b57b419f7e0fd5fde7b3c263c2b7e7d95244dd9bfd2900adf20ab" exitCode=0
Apr 23 16:49:51.735317 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.735119 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw"
Apr 23 16:49:51.735317 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.735130 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw" event={"ID":"1230f84c-3378-48ee-92bd-8cfe6b10c7fd","Type":"ContainerDied","Data":"153bf9c2285b57b419f7e0fd5fde7b3c263c2b7e7d95244dd9bfd2900adf20ab"}
Apr 23 16:49:51.735317 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.735181 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw" event={"ID":"1230f84c-3378-48ee-92bd-8cfe6b10c7fd","Type":"ContainerDied","Data":"acd1f5e24ed4c0dcb1460c64cf319e2fa79a90eb6f502b8a8482a6e4a63e3591"}
Apr 23 16:49:51.735317 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.735205 2580 scope.go:117] "RemoveContainer" containerID="153bf9c2285b57b419f7e0fd5fde7b3c263c2b7e7d95244dd9bfd2900adf20ab"
Apr 23 16:49:51.744543 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.744522 2580 scope.go:117] "RemoveContainer" containerID="5421584bd7f8055f9edea965c6e18c7736a0be05730f58e00f2c63dc1eee2083"
Apr 23 16:49:51.768207 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.768179 2580 scope.go:117] "RemoveContainer" containerID="153bf9c2285b57b419f7e0fd5fde7b3c263c2b7e7d95244dd9bfd2900adf20ab"
Apr 23 16:49:51.768607 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:49:51.768577 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"153bf9c2285b57b419f7e0fd5fde7b3c263c2b7e7d95244dd9bfd2900adf20ab\": container with ID starting with 153bf9c2285b57b419f7e0fd5fde7b3c263c2b7e7d95244dd9bfd2900adf20ab not found: ID does not exist" containerID="153bf9c2285b57b419f7e0fd5fde7b3c263c2b7e7d95244dd9bfd2900adf20ab"
Apr 23 16:49:51.768697 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.768628 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"153bf9c2285b57b419f7e0fd5fde7b3c263c2b7e7d95244dd9bfd2900adf20ab"} err="failed to get container status \"153bf9c2285b57b419f7e0fd5fde7b3c263c2b7e7d95244dd9bfd2900adf20ab\": rpc error: code = NotFound desc = could not find container \"153bf9c2285b57b419f7e0fd5fde7b3c263c2b7e7d95244dd9bfd2900adf20ab\": container with ID starting with 153bf9c2285b57b419f7e0fd5fde7b3c263c2b7e7d95244dd9bfd2900adf20ab not found: ID does not exist"
Apr 23 16:49:51.768697 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.768657 2580 scope.go:117] "RemoveContainer" containerID="5421584bd7f8055f9edea965c6e18c7736a0be05730f58e00f2c63dc1eee2083"
Apr 23 16:49:51.768697 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.768586 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw"]
Apr 23 16:49:51.768983 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:49:51.768959 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5421584bd7f8055f9edea965c6e18c7736a0be05730f58e00f2c63dc1eee2083\": container with ID starting with 5421584bd7f8055f9edea965c6e18c7736a0be05730f58e00f2c63dc1eee2083 not found: ID does not exist" containerID="5421584bd7f8055f9edea965c6e18c7736a0be05730f58e00f2c63dc1eee2083"
Apr 23 16:49:51.769035 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.768995 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5421584bd7f8055f9edea965c6e18c7736a0be05730f58e00f2c63dc1eee2083"} err="failed to get container status \"5421584bd7f8055f9edea965c6e18c7736a0be05730f58e00f2c63dc1eee2083\": rpc error: code = NotFound desc = could not find container \"5421584bd7f8055f9edea965c6e18c7736a0be05730f58e00f2c63dc1eee2083\": container with ID starting with 5421584bd7f8055f9edea965c6e18c7736a0be05730f58e00f2c63dc1eee2083 not found: ID does not exist"
Apr 23 16:49:51.778789 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:51.778765 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-548bbdfdd7-gvktw"]
Apr 23 16:49:52.537587 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:49:52.537554 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1230f84c-3378-48ee-92bd-8cfe6b10c7fd" path="/var/lib/kubelet/pods/1230f84c-3378-48ee-92bd-8cfe6b10c7fd/volumes"
Apr 23 16:50:11.796360 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:11.796316 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7"]
Apr 23 16:50:11.796812 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:11.796712 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1230f84c-3378-48ee-92bd-8cfe6b10c7fd" containerName="storage-initializer"
Apr 23 16:50:11.796812 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:11.796724 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1230f84c-3378-48ee-92bd-8cfe6b10c7fd" containerName="storage-initializer"
Apr 23 16:50:11.796812 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:11.796737 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1230f84c-3378-48ee-92bd-8cfe6b10c7fd" containerName="main"
Apr 23 16:50:11.796812 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:11.796744 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1230f84c-3378-48ee-92bd-8cfe6b10c7fd" containerName="main"
Apr 23 16:50:11.796812 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:11.796809 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="1230f84c-3378-48ee-92bd-8cfe6b10c7fd" containerName="main"
Apr 23 16:50:11.806353 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:11.806327 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7"
Apr 23 16:50:11.810185 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:11.810155 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\""
Apr 23 16:50:11.810608 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:11.810579 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-bz9j9\""
Apr 23 16:50:11.830943 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:11.830915 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7"]
Apr 23 16:50:11.881614 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:11.881579 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2759a87c-a3a4-4fb8-85b3-0efe502c107d-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7\" (UID: \"2759a87c-a3a4-4fb8-85b3-0efe502c107d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7"
Apr 23 16:50:11.881771 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:11.881629 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2759a87c-a3a4-4fb8-85b3-0efe502c107d-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7\" (UID: \"2759a87c-a3a4-4fb8-85b3-0efe502c107d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7"
Apr 23 16:50:11.881771 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:11.881658 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2759a87c-a3a4-4fb8-85b3-0efe502c107d-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7\" (UID: \"2759a87c-a3a4-4fb8-85b3-0efe502c107d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7"
Apr 23 16:50:11.881771 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:11.881674 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2759a87c-a3a4-4fb8-85b3-0efe502c107d-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7\" (UID: \"2759a87c-a3a4-4fb8-85b3-0efe502c107d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7"
Apr 23 16:50:11.881771 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:11.881701 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2759a87c-a3a4-4fb8-85b3-0efe502c107d-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7\" (UID: \"2759a87c-a3a4-4fb8-85b3-0efe502c107d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7"
Apr 23 16:50:11.881771 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:11.881730 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn9p5\" (UniqueName: \"kubernetes.io/projected/2759a87c-a3a4-4fb8-85b3-0efe502c107d-kube-api-access-dn9p5\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7\" (UID: \"2759a87c-a3a4-4fb8-85b3-0efe502c107d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7"
Apr 23 16:50:11.982361 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:11.982319 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2759a87c-a3a4-4fb8-85b3-0efe502c107d-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7\" (UID: \"2759a87c-a3a4-4fb8-85b3-0efe502c107d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7"
Apr 23 16:50:11.982570 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:11.982378 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2759a87c-a3a4-4fb8-85b3-0efe502c107d-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7\" (UID: \"2759a87c-a3a4-4fb8-85b3-0efe502c107d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7"
Apr 23 16:50:11.982570 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:11.982404 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2759a87c-a3a4-4fb8-85b3-0efe502c107d-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7\" (UID: \"2759a87c-a3a4-4fb8-85b3-0efe502c107d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7"
Apr 23 16:50:11.982570 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:11.982420 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2759a87c-a3a4-4fb8-85b3-0efe502c107d-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7\" (UID: \"2759a87c-a3a4-4fb8-85b3-0efe502c107d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7"
Apr 23 16:50:11.982570 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:11.982447 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2759a87c-a3a4-4fb8-85b3-0efe502c107d-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7\" (UID: \"2759a87c-a3a4-4fb8-85b3-0efe502c107d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7"
Apr 23 16:50:11.982570 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:11.982479 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dn9p5\" (UniqueName: \"kubernetes.io/projected/2759a87c-a3a4-4fb8-85b3-0efe502c107d-kube-api-access-dn9p5\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7\" (UID: \"2759a87c-a3a4-4fb8-85b3-0efe502c107d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7"
Apr 23 16:50:11.982857 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:11.982800 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2759a87c-a3a4-4fb8-85b3-0efe502c107d-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7\" (UID: \"2759a87c-a3a4-4fb8-85b3-0efe502c107d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7"
Apr 23 16:50:11.982857 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:11.982843 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2759a87c-a3a4-4fb8-85b3-0efe502c107d-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7\" (UID: \"2759a87c-a3a4-4fb8-85b3-0efe502c107d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7"
Apr 23 16:50:11.982931 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:11.982854 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2759a87c-a3a4-4fb8-85b3-0efe502c107d-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7\" (UID: \"2759a87c-a3a4-4fb8-85b3-0efe502c107d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7"
Apr 23 16:50:11.982966 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:11.982943 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2759a87c-a3a4-4fb8-85b3-0efe502c107d-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7\" (UID: \"2759a87c-a3a4-4fb8-85b3-0efe502c107d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7"
Apr 23 16:50:11.984833 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:11.984811 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2759a87c-a3a4-4fb8-85b3-0efe502c107d-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7\" (UID: \"2759a87c-a3a4-4fb8-85b3-0efe502c107d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7"
Apr 23 16:50:12.000086 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:12.000053 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn9p5\" (UniqueName: \"kubernetes.io/projected/2759a87c-a3a4-4fb8-85b3-0efe502c107d-kube-api-access-dn9p5\") pod \"2759a87c-a3a4-4fb8-85b3-0efe502c107d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7"
Apr 23 16:50:12.116644 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:12.116556 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7"
Apr 23 16:50:12.246807 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:12.246773 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7"]
Apr 23 16:50:12.247978 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:50:12.247947 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2759a87c_a3a4_4fb8_85b3_0efe502c107d.slice/crio-c2d123b9aa2ce10a37cc64ab29056a5b88d14a1a5a8cdb03204144f3d1e45a40 WatchSource:0}: Error finding container c2d123b9aa2ce10a37cc64ab29056a5b88d14a1a5a8cdb03204144f3d1e45a40: Status 404 returned error can't find the container with id c2d123b9aa2ce10a37cc64ab29056a5b88d14a1a5a8cdb03204144f3d1e45a40
Apr 23 16:50:12.818258 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:12.818220 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7" event={"ID":"2759a87c-a3a4-4fb8-85b3-0efe502c107d","Type":"ContainerStarted","Data":"b1293f292410fa7c438fff9dc2e6a5dc9f03ce6371d402261e117359e9c5136b"}
Apr 23 16:50:12.818258 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:12.818265 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7" event={"ID":"2759a87c-a3a4-4fb8-85b3-0efe502c107d","Type":"ContainerStarted","Data":"c2d123b9aa2ce10a37cc64ab29056a5b88d14a1a5a8cdb03204144f3d1e45a40"}
Apr 23 16:50:13.442465 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.442430 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk"]
Apr 23 16:50:13.442775 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.442748 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk" podUID="a52bc069-6503-4fff-a388-aa87a1651480" containerName="main" containerID="cri-o://8812787ac50914c12bb7a27a9df7b8a912acc27da8931ee3463b682e93b59f6d" gracePeriod=30
Apr 23 16:50:13.706468 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.706440 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk"
Apr 23 16:50:13.799158 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.799117 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a52bc069-6503-4fff-a388-aa87a1651480-home\") pod \"a52bc069-6503-4fff-a388-aa87a1651480\" (UID: \"a52bc069-6503-4fff-a388-aa87a1651480\") "
Apr 23 16:50:13.799390 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.799193 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7q7h9\" (UniqueName: \"kubernetes.io/projected/a52bc069-6503-4fff-a388-aa87a1651480-kube-api-access-7q7h9\") pod \"a52bc069-6503-4fff-a388-aa87a1651480\" (UID: \"a52bc069-6503-4fff-a388-aa87a1651480\") "
Apr 23 16:50:13.799390 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.799248 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a52bc069-6503-4fff-a388-aa87a1651480-tls-certs\") pod \"a52bc069-6503-4fff-a388-aa87a1651480\" (UID: \"a52bc069-6503-4fff-a388-aa87a1651480\") "
Apr 23 16:50:13.799390 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.799318 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a52bc069-6503-4fff-a388-aa87a1651480-kserve-provision-location\") pod \"a52bc069-6503-4fff-a388-aa87a1651480\" (UID: \"a52bc069-6503-4fff-a388-aa87a1651480\") "
Apr 23 16:50:13.799390 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.799341 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a52bc069-6503-4fff-a388-aa87a1651480-dshm\") pod \"a52bc069-6503-4fff-a388-aa87a1651480\" (UID: \"a52bc069-6503-4fff-a388-aa87a1651480\") "
Apr 23 16:50:13.799390 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.799367 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a52bc069-6503-4fff-a388-aa87a1651480-model-cache\") pod \"a52bc069-6503-4fff-a388-aa87a1651480\" (UID: \"a52bc069-6503-4fff-a388-aa87a1651480\") "
Apr 23 16:50:13.799674 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.799448 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a52bc069-6503-4fff-a388-aa87a1651480-home" (OuterVolumeSpecName: "home") pod "a52bc069-6503-4fff-a388-aa87a1651480" (UID: "a52bc069-6503-4fff-a388-aa87a1651480"). InnerVolumeSpecName "home".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:50:13.799727 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.799672 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a52bc069-6503-4fff-a388-aa87a1651480-home\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:50:13.799769 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.799735 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a52bc069-6503-4fff-a388-aa87a1651480-model-cache" (OuterVolumeSpecName: "model-cache") pod "a52bc069-6503-4fff-a388-aa87a1651480" (UID: "a52bc069-6503-4fff-a388-aa87a1651480"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:50:13.801558 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.801526 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a52bc069-6503-4fff-a388-aa87a1651480-dshm" (OuterVolumeSpecName: "dshm") pod "a52bc069-6503-4fff-a388-aa87a1651480" (UID: "a52bc069-6503-4fff-a388-aa87a1651480"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:50:13.801859 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.801827 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a52bc069-6503-4fff-a388-aa87a1651480-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a52bc069-6503-4fff-a388-aa87a1651480" (UID: "a52bc069-6503-4fff-a388-aa87a1651480"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:50:13.801859 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.801828 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a52bc069-6503-4fff-a388-aa87a1651480-kube-api-access-7q7h9" (OuterVolumeSpecName: "kube-api-access-7q7h9") pod "a52bc069-6503-4fff-a388-aa87a1651480" (UID: "a52bc069-6503-4fff-a388-aa87a1651480"). InnerVolumeSpecName "kube-api-access-7q7h9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:50:13.823645 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.823614 2580 generic.go:358] "Generic (PLEG): container finished" podID="2759a87c-a3a4-4fb8-85b3-0efe502c107d" containerID="b1293f292410fa7c438fff9dc2e6a5dc9f03ce6371d402261e117359e9c5136b" exitCode=0 Apr 23 16:50:13.824027 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.823694 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7" event={"ID":"2759a87c-a3a4-4fb8-85b3-0efe502c107d","Type":"ContainerDied","Data":"b1293f292410fa7c438fff9dc2e6a5dc9f03ce6371d402261e117359e9c5136b"} Apr 23 16:50:13.825420 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.825386 2580 generic.go:358] "Generic (PLEG): container finished" podID="a52bc069-6503-4fff-a388-aa87a1651480" containerID="8812787ac50914c12bb7a27a9df7b8a912acc27da8931ee3463b682e93b59f6d" exitCode=0 Apr 23 16:50:13.825493 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.825434 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk" event={"ID":"a52bc069-6503-4fff-a388-aa87a1651480","Type":"ContainerDied","Data":"8812787ac50914c12bb7a27a9df7b8a912acc27da8931ee3463b682e93b59f6d"} Apr 23 16:50:13.825493 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.825458 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk" event={"ID":"a52bc069-6503-4fff-a388-aa87a1651480","Type":"ContainerDied","Data":"34234b7cb92ae250cb81ecd73b9e8461ad30c818e4d59097853ba7525f05d16c"} Apr 23 16:50:13.825493 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.825471 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk" Apr 23 16:50:13.825493 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.825478 2580 scope.go:117] "RemoveContainer" containerID="8812787ac50914c12bb7a27a9df7b8a912acc27da8931ee3463b682e93b59f6d" Apr 23 16:50:13.836131 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.836110 2580 scope.go:117] "RemoveContainer" containerID="ab19625ab0ec09790225461e8f8171f4d6f6ab258f6921732b5edb19c207e05b" Apr 23 16:50:13.860366 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.860335 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a52bc069-6503-4fff-a388-aa87a1651480-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a52bc069-6503-4fff-a388-aa87a1651480" (UID: "a52bc069-6503-4fff-a388-aa87a1651480"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:50:13.900649 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.900611 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a52bc069-6503-4fff-a388-aa87a1651480-tls-certs\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:50:13.900649 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.900643 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a52bc069-6503-4fff-a388-aa87a1651480-kserve-provision-location\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:50:13.900649 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.900656 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a52bc069-6503-4fff-a388-aa87a1651480-dshm\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:50:13.900918 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.900671 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a52bc069-6503-4fff-a388-aa87a1651480-model-cache\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:50:13.900918 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.900685 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7q7h9\" (UniqueName: \"kubernetes.io/projected/a52bc069-6503-4fff-a388-aa87a1651480-kube-api-access-7q7h9\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:50:13.901275 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.901254 2580 scope.go:117] "RemoveContainer" containerID="8812787ac50914c12bb7a27a9df7b8a912acc27da8931ee3463b682e93b59f6d" Apr 23 16:50:13.901612 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:50:13.901592 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"8812787ac50914c12bb7a27a9df7b8a912acc27da8931ee3463b682e93b59f6d\": container with ID starting with 8812787ac50914c12bb7a27a9df7b8a912acc27da8931ee3463b682e93b59f6d not found: ID does not exist" containerID="8812787ac50914c12bb7a27a9df7b8a912acc27da8931ee3463b682e93b59f6d" Apr 23 16:50:13.901657 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.901620 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8812787ac50914c12bb7a27a9df7b8a912acc27da8931ee3463b682e93b59f6d"} err="failed to get container status \"8812787ac50914c12bb7a27a9df7b8a912acc27da8931ee3463b682e93b59f6d\": rpc error: code = NotFound desc = could not find container \"8812787ac50914c12bb7a27a9df7b8a912acc27da8931ee3463b682e93b59f6d\": container with ID starting with 8812787ac50914c12bb7a27a9df7b8a912acc27da8931ee3463b682e93b59f6d not found: ID does not exist" Apr 23 16:50:13.901657 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.901643 2580 scope.go:117] "RemoveContainer" containerID="ab19625ab0ec09790225461e8f8171f4d6f6ab258f6921732b5edb19c207e05b" Apr 23 16:50:13.901921 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:50:13.901900 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab19625ab0ec09790225461e8f8171f4d6f6ab258f6921732b5edb19c207e05b\": container with ID starting with ab19625ab0ec09790225461e8f8171f4d6f6ab258f6921732b5edb19c207e05b not found: ID does not exist" containerID="ab19625ab0ec09790225461e8f8171f4d6f6ab258f6921732b5edb19c207e05b" Apr 23 16:50:13.901968 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:13.901928 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab19625ab0ec09790225461e8f8171f4d6f6ab258f6921732b5edb19c207e05b"} err="failed to get container status \"ab19625ab0ec09790225461e8f8171f4d6f6ab258f6921732b5edb19c207e05b\": rpc error: code = NotFound desc = could not find 
container \"ab19625ab0ec09790225461e8f8171f4d6f6ab258f6921732b5edb19c207e05b\": container with ID starting with ab19625ab0ec09790225461e8f8171f4d6f6ab258f6921732b5edb19c207e05b not found: ID does not exist" Apr 23 16:50:14.148557 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:14.148523 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk"] Apr 23 16:50:14.152170 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:14.152146 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk"] Apr 23 16:50:14.520505 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:14.520478 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfkqz_5949893b-cd3d-46d5-b194-4ef1ad542b81/ovn-acl-logging/0.log" Apr 23 16:50:14.520603 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:14.520477 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfkqz_5949893b-cd3d-46d5-b194-4ef1ad542b81/ovn-acl-logging/0.log" Apr 23 16:50:14.540433 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:14.540401 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a52bc069-6503-4fff-a388-aa87a1651480" path="/var/lib/kubelet/pods/a52bc069-6503-4fff-a388-aa87a1651480/volumes" Apr 23 16:50:15.837572 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:15.837433 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7" event={"ID":"2759a87c-a3a4-4fb8-85b3-0efe502c107d","Type":"ContainerStarted","Data":"49370aff657abe1cf277857122fab65cdc7a23ced97d7638260ed8020f9c5805"} Apr 23 16:50:18.617943 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:18.617891 2580 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6f555c65757m6xk" podUID="a52bc069-6503-4fff-a388-aa87a1651480" containerName="main" probeResult="failure" output="Get \"https://10.132.0.52:8000/health\": context deadline exceeded" Apr 23 16:50:28.152343 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:28.152300 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4"] Apr 23 16:50:28.152843 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:28.152817 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a52bc069-6503-4fff-a388-aa87a1651480" containerName="main" Apr 23 16:50:28.152843 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:28.152838 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52bc069-6503-4fff-a388-aa87a1651480" containerName="main" Apr 23 16:50:28.152975 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:28.152862 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a52bc069-6503-4fff-a388-aa87a1651480" containerName="storage-initializer" Apr 23 16:50:28.152975 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:28.152872 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52bc069-6503-4fff-a388-aa87a1651480" containerName="storage-initializer" Apr 23 16:50:28.152975 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:28.152974 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="a52bc069-6503-4fff-a388-aa87a1651480" containerName="main" Apr 23 16:50:28.156862 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:28.156841 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" Apr 23 16:50:28.159824 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:28.159802 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 23 16:50:28.165382 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:28.165348 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4"] Apr 23 16:50:28.240088 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:28.240051 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e081da07-6ba7-41fa-925c-1c9264885cff-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4\" (UID: \"e081da07-6ba7-41fa-925c-1c9264885cff\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" Apr 23 16:50:28.240312 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:28.240113 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e081da07-6ba7-41fa-925c-1c9264885cff-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4\" (UID: \"e081da07-6ba7-41fa-925c-1c9264885cff\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" Apr 23 16:50:28.240312 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:28.240275 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e081da07-6ba7-41fa-925c-1c9264885cff-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4\" (UID: 
\"e081da07-6ba7-41fa-925c-1c9264885cff\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" Apr 23 16:50:28.240439 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:28.240355 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5zfb\" (UniqueName: \"kubernetes.io/projected/e081da07-6ba7-41fa-925c-1c9264885cff-kube-api-access-r5zfb\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4\" (UID: \"e081da07-6ba7-41fa-925c-1c9264885cff\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" Apr 23 16:50:28.240439 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:28.240409 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e081da07-6ba7-41fa-925c-1c9264885cff-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4\" (UID: \"e081da07-6ba7-41fa-925c-1c9264885cff\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" Apr 23 16:50:28.240530 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:28.240442 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e081da07-6ba7-41fa-925c-1c9264885cff-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4\" (UID: \"e081da07-6ba7-41fa-925c-1c9264885cff\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" Apr 23 16:50:28.341041 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:28.341000 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e081da07-6ba7-41fa-925c-1c9264885cff-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4\" (UID: 
\"e081da07-6ba7-41fa-925c-1c9264885cff\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" Apr 23 16:50:28.341237 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:28.341052 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e081da07-6ba7-41fa-925c-1c9264885cff-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4\" (UID: \"e081da07-6ba7-41fa-925c-1c9264885cff\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" Apr 23 16:50:28.341237 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:28.341094 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e081da07-6ba7-41fa-925c-1c9264885cff-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4\" (UID: \"e081da07-6ba7-41fa-925c-1c9264885cff\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" Apr 23 16:50:28.341237 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:28.341162 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e081da07-6ba7-41fa-925c-1c9264885cff-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4\" (UID: \"e081da07-6ba7-41fa-925c-1c9264885cff\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" Apr 23 16:50:28.341237 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:28.341187 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5zfb\" (UniqueName: \"kubernetes.io/projected/e081da07-6ba7-41fa-925c-1c9264885cff-kube-api-access-r5zfb\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4\" (UID: \"e081da07-6ba7-41fa-925c-1c9264885cff\") 
" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" Apr 23 16:50:28.341237 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:28.341222 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e081da07-6ba7-41fa-925c-1c9264885cff-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4\" (UID: \"e081da07-6ba7-41fa-925c-1c9264885cff\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" Apr 23 16:50:28.341782 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:28.341732 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e081da07-6ba7-41fa-925c-1c9264885cff-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4\" (UID: \"e081da07-6ba7-41fa-925c-1c9264885cff\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" Apr 23 16:50:28.341782 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:28.341758 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e081da07-6ba7-41fa-925c-1c9264885cff-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4\" (UID: \"e081da07-6ba7-41fa-925c-1c9264885cff\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" Apr 23 16:50:28.341963 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:28.341835 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e081da07-6ba7-41fa-925c-1c9264885cff-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4\" (UID: \"e081da07-6ba7-41fa-925c-1c9264885cff\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" Apr 23 
16:50:28.343806 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:28.343779 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e081da07-6ba7-41fa-925c-1c9264885cff-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4\" (UID: \"e081da07-6ba7-41fa-925c-1c9264885cff\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" Apr 23 16:50:28.344030 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:28.344008 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e081da07-6ba7-41fa-925c-1c9264885cff-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4\" (UID: \"e081da07-6ba7-41fa-925c-1c9264885cff\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" Apr 23 16:50:28.349715 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:28.349690 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5zfb\" (UniqueName: \"kubernetes.io/projected/e081da07-6ba7-41fa-925c-1c9264885cff-kube-api-access-r5zfb\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4\" (UID: \"e081da07-6ba7-41fa-925c-1c9264885cff\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" Apr 23 16:50:28.472582 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:28.472152 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" Apr 23 16:50:28.852986 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:28.852925 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4"] Apr 23 16:50:28.854191 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:50:28.854152 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode081da07_6ba7_41fa_925c_1c9264885cff.slice/crio-cd33191771181f75772e820b0aecb3a3df453b881da6b097f5dbf65864b4c286 WatchSource:0}: Error finding container cd33191771181f75772e820b0aecb3a3df453b881da6b097f5dbf65864b4c286: Status 404 returned error can't find the container with id cd33191771181f75772e820b0aecb3a3df453b881da6b097f5dbf65864b4c286 Apr 23 16:50:28.903459 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:28.903420 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" event={"ID":"e081da07-6ba7-41fa-925c-1c9264885cff","Type":"ContainerStarted","Data":"cd33191771181f75772e820b0aecb3a3df453b881da6b097f5dbf65864b4c286"} Apr 23 16:50:29.910549 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:29.910485 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" event={"ID":"e081da07-6ba7-41fa-925c-1c9264885cff","Type":"ContainerStarted","Data":"0ae56e413c6b42445b35ec5b580ee8cd59c93a9b811d58b6e1665319b4d62b9e"} Apr 23 16:50:44.979375 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:44.979336 2580 generic.go:358] "Generic (PLEG): container finished" podID="e081da07-6ba7-41fa-925c-1c9264885cff" containerID="0ae56e413c6b42445b35ec5b580ee8cd59c93a9b811d58b6e1665319b4d62b9e" exitCode=0 Apr 23 16:50:44.979799 ip-10-0-128-198 kubenswrapper[2580]: I0423 
16:50:44.979383 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" event={"ID":"e081da07-6ba7-41fa-925c-1c9264885cff","Type":"ContainerDied","Data":"0ae56e413c6b42445b35ec5b580ee8cd59c93a9b811d58b6e1665319b4d62b9e"} Apr 23 16:50:46.996860 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:46.996760 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7" event={"ID":"2759a87c-a3a4-4fb8-85b3-0efe502c107d","Type":"ContainerStarted","Data":"a22ddec9d8472f8a7c5b02b0d3a9363fbc2baf4d7198c90b812b09dcc52fe02e"} Apr 23 16:50:46.997905 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:46.997857 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7" Apr 23 16:50:47.001683 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:47.001409 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7" podUID="2759a87c-a3a4-4fb8-85b3-0efe502c107d" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 23 16:50:47.028939 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:47.028840 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7" podStartSLOduration=3.983934918 podStartE2EDuration="36.028823368s" podCreationTimestamp="2026-04-23 16:50:11 +0000 UTC" firstStartedPulling="2026-04-23 16:50:13.825005985 +0000 UTC m=+899.886080217" lastFinishedPulling="2026-04-23 16:50:45.869894434 +0000 UTC m=+931.930968667" observedRunningTime="2026-04-23 16:50:47.023007261 +0000 UTC m=+933.084081526" watchObservedRunningTime="2026-04-23 16:50:47.028823368 +0000 UTC m=+933.089897624" 
Apr 23 16:50:48.004976 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:48.004796 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7" podUID="2759a87c-a3a4-4fb8-85b3-0efe502c107d" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 23 16:50:49.010459 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:49.010417 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7" podUID="2759a87c-a3a4-4fb8-85b3-0efe502c107d" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 23 16:50:52.117443 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:52.117082 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7"
Apr 23 16:50:52.117443 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:52.117145 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7"
Apr 23 16:50:52.118021 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:52.117491 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7" podUID="2759a87c-a3a4-4fb8-85b3-0efe502c107d" containerName="tokenizer" probeResult="failure" output="Get \"http://10.132.0.53:8082/healthz\": dial tcp 10.132.0.53:8082: connect: connection refused"
Apr 23 16:50:52.119037 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:50:52.119005 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7" podUID="2759a87c-a3a4-4fb8-85b3-0efe502c107d" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 23 16:51:01.256346 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:01.256284 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7"]
Apr 23 16:51:01.257266 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:01.257211 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7" podUID="2759a87c-a3a4-4fb8-85b3-0efe502c107d" containerName="main" containerID="cri-o://49370aff657abe1cf277857122fab65cdc7a23ced97d7638260ed8020f9c5805" gracePeriod=30
Apr 23 16:51:01.257631 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:01.257318 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7" podUID="2759a87c-a3a4-4fb8-85b3-0efe502c107d" containerName="tokenizer" containerID="cri-o://a22ddec9d8472f8a7c5b02b0d3a9363fbc2baf4d7198c90b812b09dcc52fe02e" gracePeriod=30
Apr 23 16:51:01.259665 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:01.259595 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7" podUID="2759a87c-a3a4-4fb8-85b3-0efe502c107d" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 23 16:51:04.078922 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:04.078835 2580 generic.go:358] "Generic (PLEG): container finished" podID="2759a87c-a3a4-4fb8-85b3-0efe502c107d" containerID="a22ddec9d8472f8a7c5b02b0d3a9363fbc2baf4d7198c90b812b09dcc52fe02e" exitCode=0
Apr 23 16:51:04.078922 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:04.078868 2580 generic.go:358] "Generic (PLEG): container finished" podID="2759a87c-a3a4-4fb8-85b3-0efe502c107d" containerID="49370aff657abe1cf277857122fab65cdc7a23ced97d7638260ed8020f9c5805" exitCode=0
Apr 23 16:51:04.078922 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:04.078897 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7" event={"ID":"2759a87c-a3a4-4fb8-85b3-0efe502c107d","Type":"ContainerDied","Data":"a22ddec9d8472f8a7c5b02b0d3a9363fbc2baf4d7198c90b812b09dcc52fe02e"}
Apr 23 16:51:04.079485 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:04.078946 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7" event={"ID":"2759a87c-a3a4-4fb8-85b3-0efe502c107d","Type":"ContainerDied","Data":"49370aff657abe1cf277857122fab65cdc7a23ced97d7638260ed8020f9c5805"}
Apr 23 16:51:05.279969 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:05.279944 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7"
Apr 23 16:51:05.391898 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:05.391818 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2759a87c-a3a4-4fb8-85b3-0efe502c107d-tokenizer-cache\") pod \"2759a87c-a3a4-4fb8-85b3-0efe502c107d\" (UID: \"2759a87c-a3a4-4fb8-85b3-0efe502c107d\") "
Apr 23 16:51:05.391898 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:05.391874 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2759a87c-a3a4-4fb8-85b3-0efe502c107d-tokenizer-tmp\") pod \"2759a87c-a3a4-4fb8-85b3-0efe502c107d\" (UID: \"2759a87c-a3a4-4fb8-85b3-0efe502c107d\") "
Apr 23 16:51:05.392097 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:05.391904 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2759a87c-a3a4-4fb8-85b3-0efe502c107d-tls-certs\") pod \"2759a87c-a3a4-4fb8-85b3-0efe502c107d\" (UID: \"2759a87c-a3a4-4fb8-85b3-0efe502c107d\") "
Apr 23 16:51:05.392097 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:05.391954 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2759a87c-a3a4-4fb8-85b3-0efe502c107d-kserve-provision-location\") pod \"2759a87c-a3a4-4fb8-85b3-0efe502c107d\" (UID: \"2759a87c-a3a4-4fb8-85b3-0efe502c107d\") "
Apr 23 16:51:05.392097 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:05.391984 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dn9p5\" (UniqueName: \"kubernetes.io/projected/2759a87c-a3a4-4fb8-85b3-0efe502c107d-kube-api-access-dn9p5\") pod \"2759a87c-a3a4-4fb8-85b3-0efe502c107d\" (UID: \"2759a87c-a3a4-4fb8-85b3-0efe502c107d\") "
Apr 23 16:51:05.392097 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:05.392088 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2759a87c-a3a4-4fb8-85b3-0efe502c107d-tokenizer-uds\") pod \"2759a87c-a3a4-4fb8-85b3-0efe502c107d\" (UID: \"2759a87c-a3a4-4fb8-85b3-0efe502c107d\") "
Apr 23 16:51:05.392342 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:05.392144 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2759a87c-a3a4-4fb8-85b3-0efe502c107d-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "2759a87c-a3a4-4fb8-85b3-0efe502c107d" (UID: "2759a87c-a3a4-4fb8-85b3-0efe502c107d"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:51:05.392342 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:05.392273 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2759a87c-a3a4-4fb8-85b3-0efe502c107d-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "2759a87c-a3a4-4fb8-85b3-0efe502c107d" (UID: "2759a87c-a3a4-4fb8-85b3-0efe502c107d"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:51:05.392469 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:05.392404 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2759a87c-a3a4-4fb8-85b3-0efe502c107d-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "2759a87c-a3a4-4fb8-85b3-0efe502c107d" (UID: "2759a87c-a3a4-4fb8-85b3-0efe502c107d"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:51:05.392530 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:05.392472 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2759a87c-a3a4-4fb8-85b3-0efe502c107d-tokenizer-cache\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:51:05.392530 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:05.392496 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2759a87c-a3a4-4fb8-85b3-0efe502c107d-tokenizer-tmp\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:51:05.392877 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:05.392844 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2759a87c-a3a4-4fb8-85b3-0efe502c107d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2759a87c-a3a4-4fb8-85b3-0efe502c107d" (UID: "2759a87c-a3a4-4fb8-85b3-0efe502c107d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:51:05.394172 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:05.394143 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2759a87c-a3a4-4fb8-85b3-0efe502c107d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2759a87c-a3a4-4fb8-85b3-0efe502c107d" (UID: "2759a87c-a3a4-4fb8-85b3-0efe502c107d"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:51:05.394372 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:05.394350 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2759a87c-a3a4-4fb8-85b3-0efe502c107d-kube-api-access-dn9p5" (OuterVolumeSpecName: "kube-api-access-dn9p5") pod "2759a87c-a3a4-4fb8-85b3-0efe502c107d" (UID: "2759a87c-a3a4-4fb8-85b3-0efe502c107d"). InnerVolumeSpecName "kube-api-access-dn9p5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:51:05.493951 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:05.493914 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2759a87c-a3a4-4fb8-85b3-0efe502c107d-tls-certs\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:51:05.493951 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:05.493948 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2759a87c-a3a4-4fb8-85b3-0efe502c107d-kserve-provision-location\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:51:05.494176 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:05.493964 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dn9p5\" (UniqueName: \"kubernetes.io/projected/2759a87c-a3a4-4fb8-85b3-0efe502c107d-kube-api-access-dn9p5\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:51:05.494176 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:05.493978 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2759a87c-a3a4-4fb8-85b3-0efe502c107d-tokenizer-uds\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:51:06.090400 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:06.090355 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7" event={"ID":"2759a87c-a3a4-4fb8-85b3-0efe502c107d","Type":"ContainerDied","Data":"c2d123b9aa2ce10a37cc64ab29056a5b88d14a1a5a8cdb03204144f3d1e45a40"}
Apr 23 16:51:06.090586 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:06.090422 2580 scope.go:117] "RemoveContainer" containerID="a22ddec9d8472f8a7c5b02b0d3a9363fbc2baf4d7198c90b812b09dcc52fe02e"
Apr 23 16:51:06.090586 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:06.090372 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7"
Apr 23 16:51:06.102469 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:06.102447 2580 scope.go:117] "RemoveContainer" containerID="49370aff657abe1cf277857122fab65cdc7a23ced97d7638260ed8020f9c5805"
Apr 23 16:51:06.113365 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:06.113340 2580 scope.go:117] "RemoveContainer" containerID="b1293f292410fa7c438fff9dc2e6a5dc9f03ce6371d402261e117359e9c5136b"
Apr 23 16:51:06.117510 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:06.117463 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7"]
Apr 23 16:51:06.120617 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:06.120594 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f7h6h7"]
Apr 23 16:51:06.538354 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:06.538313 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2759a87c-a3a4-4fb8-85b3-0efe502c107d" path="/var/lib/kubelet/pods/2759a87c-a3a4-4fb8-85b3-0efe502c107d/volumes"
Apr 23 16:51:08.958924 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:08.958891 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8"]
Apr 23 16:51:08.959318 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:08.959299 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2759a87c-a3a4-4fb8-85b3-0efe502c107d" containerName="tokenizer"
Apr 23 16:51:08.959318 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:08.959315 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2759a87c-a3a4-4fb8-85b3-0efe502c107d" containerName="tokenizer"
Apr 23 16:51:08.959400 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:08.959324 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2759a87c-a3a4-4fb8-85b3-0efe502c107d" containerName="main"
Apr 23 16:51:08.959400 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:08.959330 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2759a87c-a3a4-4fb8-85b3-0efe502c107d" containerName="main"
Apr 23 16:51:08.959400 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:08.959351 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2759a87c-a3a4-4fb8-85b3-0efe502c107d" containerName="storage-initializer"
Apr 23 16:51:08.959400 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:08.959358 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2759a87c-a3a4-4fb8-85b3-0efe502c107d" containerName="storage-initializer"
Apr 23 16:51:08.959530 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:08.959418 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="2759a87c-a3a4-4fb8-85b3-0efe502c107d" containerName="main"
Apr 23 16:51:08.959530 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:08.959427 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="2759a87c-a3a4-4fb8-85b3-0efe502c107d" containerName="tokenizer"
Apr 23 16:51:08.993046 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:08.993012 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8"]
Apr 23 16:51:08.993236 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:08.993171 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8"
Apr 23 16:51:08.996172 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:08.996146 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\""
Apr 23 16:51:09.126895 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.126852 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/43ede598-957f-413f-a2bb-70e1f52d25fe-dshm\") pod \"precise-prefix-cache-test-kserve-589449d9f5-s6fk8\" (UID: \"43ede598-957f-413f-a2bb-70e1f52d25fe\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8"
Apr 23 16:51:09.126895 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.126896 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/43ede598-957f-413f-a2bb-70e1f52d25fe-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-589449d9f5-s6fk8\" (UID: \"43ede598-957f-413f-a2bb-70e1f52d25fe\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8"
Apr 23 16:51:09.127116 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.126926 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/43ede598-957f-413f-a2bb-70e1f52d25fe-tls-certs\") pod \"precise-prefix-cache-test-kserve-589449d9f5-s6fk8\" (UID: \"43ede598-957f-413f-a2bb-70e1f52d25fe\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8"
Apr 23 16:51:09.127116 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.126947 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k99p7\" (UniqueName: \"kubernetes.io/projected/43ede598-957f-413f-a2bb-70e1f52d25fe-kube-api-access-k99p7\") pod \"precise-prefix-cache-test-kserve-589449d9f5-s6fk8\" (UID: \"43ede598-957f-413f-a2bb-70e1f52d25fe\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8"
Apr 23 16:51:09.127116 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.127013 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/43ede598-957f-413f-a2bb-70e1f52d25fe-home\") pod \"precise-prefix-cache-test-kserve-589449d9f5-s6fk8\" (UID: \"43ede598-957f-413f-a2bb-70e1f52d25fe\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8"
Apr 23 16:51:09.127116 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.127070 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/43ede598-957f-413f-a2bb-70e1f52d25fe-model-cache\") pod \"precise-prefix-cache-test-kserve-589449d9f5-s6fk8\" (UID: \"43ede598-957f-413f-a2bb-70e1f52d25fe\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8"
Apr 23 16:51:09.228075 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.228035 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/43ede598-957f-413f-a2bb-70e1f52d25fe-home\") pod \"precise-prefix-cache-test-kserve-589449d9f5-s6fk8\" (UID: \"43ede598-957f-413f-a2bb-70e1f52d25fe\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8"
Apr 23 16:51:09.228262 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.228095 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/43ede598-957f-413f-a2bb-70e1f52d25fe-model-cache\") pod \"precise-prefix-cache-test-kserve-589449d9f5-s6fk8\" (UID: \"43ede598-957f-413f-a2bb-70e1f52d25fe\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8"
Apr 23 16:51:09.228262 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.228171 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/43ede598-957f-413f-a2bb-70e1f52d25fe-dshm\") pod \"precise-prefix-cache-test-kserve-589449d9f5-s6fk8\" (UID: \"43ede598-957f-413f-a2bb-70e1f52d25fe\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8"
Apr 23 16:51:09.228262 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.228201 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/43ede598-957f-413f-a2bb-70e1f52d25fe-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-589449d9f5-s6fk8\" (UID: \"43ede598-957f-413f-a2bb-70e1f52d25fe\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8"
Apr 23 16:51:09.228262 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.228243 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/43ede598-957f-413f-a2bb-70e1f52d25fe-tls-certs\") pod \"precise-prefix-cache-test-kserve-589449d9f5-s6fk8\" (UID: \"43ede598-957f-413f-a2bb-70e1f52d25fe\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8"
Apr 23 16:51:09.228558 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.228272 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k99p7\" (UniqueName: \"kubernetes.io/projected/43ede598-957f-413f-a2bb-70e1f52d25fe-kube-api-access-k99p7\") pod \"precise-prefix-cache-test-kserve-589449d9f5-s6fk8\" (UID: \"43ede598-957f-413f-a2bb-70e1f52d25fe\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8"
Apr 23 16:51:09.228558 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.228525 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/43ede598-957f-413f-a2bb-70e1f52d25fe-home\") pod \"precise-prefix-cache-test-kserve-589449d9f5-s6fk8\" (UID: \"43ede598-957f-413f-a2bb-70e1f52d25fe\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8"
Apr 23 16:51:09.228973 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.228924 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/43ede598-957f-413f-a2bb-70e1f52d25fe-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-589449d9f5-s6fk8\" (UID: \"43ede598-957f-413f-a2bb-70e1f52d25fe\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8"
Apr 23 16:51:09.229092 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.229007 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/43ede598-957f-413f-a2bb-70e1f52d25fe-model-cache\") pod \"precise-prefix-cache-test-kserve-589449d9f5-s6fk8\" (UID: \"43ede598-957f-413f-a2bb-70e1f52d25fe\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8"
Apr 23 16:51:09.231360 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.231282 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/43ede598-957f-413f-a2bb-70e1f52d25fe-dshm\") pod \"precise-prefix-cache-test-kserve-589449d9f5-s6fk8\" (UID: \"43ede598-957f-413f-a2bb-70e1f52d25fe\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8"
Apr 23 16:51:09.232211 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.232168 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/43ede598-957f-413f-a2bb-70e1f52d25fe-tls-certs\") pod \"precise-prefix-cache-test-kserve-589449d9f5-s6fk8\" (UID: \"43ede598-957f-413f-a2bb-70e1f52d25fe\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8"
Apr 23 16:51:09.233006 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.232974 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt"]
Apr 23 16:51:09.239068 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.239042 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k99p7\" (UniqueName: \"kubernetes.io/projected/43ede598-957f-413f-a2bb-70e1f52d25fe-kube-api-access-k99p7\") pod \"precise-prefix-cache-test-kserve-589449d9f5-s6fk8\" (UID: \"43ede598-957f-413f-a2bb-70e1f52d25fe\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8"
Apr 23 16:51:09.264194 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.264153 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt"]
Apr 23 16:51:09.264434 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.264413 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt"
Apr 23 16:51:09.267131 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.267104 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-epp-sa-dockercfg-b8dfz\""
Apr 23 16:51:09.306004 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.305970 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8"
Apr 23 16:51:09.429948 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.429912 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt\" (UID: \"8c2041ac-d2b9-467c-95aa-1ff25244ac2b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt"
Apr 23 16:51:09.430111 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.429956 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt\" (UID: \"8c2041ac-d2b9-467c-95aa-1ff25244ac2b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt"
Apr 23 16:51:09.430111 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.430029 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt\" (UID: \"8c2041ac-d2b9-467c-95aa-1ff25244ac2b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt"
Apr 23 16:51:09.430111 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.430063 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt\" (UID: \"8c2041ac-d2b9-467c-95aa-1ff25244ac2b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt"
Apr 23 16:51:09.430283 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.430152 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng6vt\" (UniqueName: \"kubernetes.io/projected/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-kube-api-access-ng6vt\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt\" (UID: \"8c2041ac-d2b9-467c-95aa-1ff25244ac2b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt"
Apr 23 16:51:09.430283 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.430273 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt\" (UID: \"8c2041ac-d2b9-467c-95aa-1ff25244ac2b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt"
Apr 23 16:51:09.442038 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.442014 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8"]
Apr 23 16:51:09.442433 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:51:09.442412 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43ede598_957f_413f_a2bb_70e1f52d25fe.slice/crio-a6c45c852dd6036748b80be88a03932deb73f6255a1a234394dd065d3c59ea4a WatchSource:0}: Error finding container a6c45c852dd6036748b80be88a03932deb73f6255a1a234394dd065d3c59ea4a: Status 404 returned error can't find the container with id a6c45c852dd6036748b80be88a03932deb73f6255a1a234394dd065d3c59ea4a
Apr 23 16:51:09.531719 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.531685 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt\" (UID: \"8c2041ac-d2b9-467c-95aa-1ff25244ac2b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt"
Apr 23 16:51:09.531882 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.531731 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt\" (UID: \"8c2041ac-d2b9-467c-95aa-1ff25244ac2b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt"
Apr 23 16:51:09.531882 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.531765 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt\" (UID: \"8c2041ac-d2b9-467c-95aa-1ff25244ac2b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt"
Apr 23 16:51:09.531882 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.531805 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt\" (UID: \"8c2041ac-d2b9-467c-95aa-1ff25244ac2b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt"
Apr 23 16:51:09.531882 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.531832 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt\" (UID: \"8c2041ac-d2b9-467c-95aa-1ff25244ac2b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt"
Apr 23 16:51:09.531882 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.531877 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ng6vt\" (UniqueName: \"kubernetes.io/projected/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-kube-api-access-ng6vt\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt\" (UID: \"8c2041ac-d2b9-467c-95aa-1ff25244ac2b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt"
Apr 23 16:51:09.532184 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.532157 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt\" (UID: \"8c2041ac-d2b9-467c-95aa-1ff25244ac2b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt"
Apr 23 16:51:09.532246 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.532209 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt\" (UID: \"8c2041ac-d2b9-467c-95aa-1ff25244ac2b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt"
Apr 23 16:51:09.532546 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.532524 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt\" (UID: \"8c2041ac-d2b9-467c-95aa-1ff25244ac2b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt"
Apr 23 16:51:09.532628 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.532541 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt\" (UID: \"8c2041ac-d2b9-467c-95aa-1ff25244ac2b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt"
Apr 23 16:51:09.534595 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.534571 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt\" (UID: \"8c2041ac-d2b9-467c-95aa-1ff25244ac2b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt"
Apr 23 16:51:09.540910 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.540883 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng6vt\" (UniqueName: \"kubernetes.io/projected/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-kube-api-access-ng6vt\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt\" (UID: \"8c2041ac-d2b9-467c-95aa-1ff25244ac2b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt"
Apr 23 16:51:09.574996 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.574966 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt"
Apr 23 16:51:09.706579 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.706527 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt"]
Apr 23 16:51:09.711253 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:09.711231 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 16:51:10.110926 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:10.110789 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt" event={"ID":"8c2041ac-d2b9-467c-95aa-1ff25244ac2b","Type":"ContainerStarted","Data":"b1445dab32dbceb7df65acdf60ecf47d08ed29b05628e622ed9e9b54e99bad2d"}
Apr 23 16:51:10.110926 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:10.110849 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt" event={"ID":"8c2041ac-d2b9-467c-95aa-1ff25244ac2b","Type":"ContainerStarted","Data":"feb384418243127670b48f939ac1fff2cdfb61f9920835c7cdb175864345fc86"}
Apr 23 16:51:10.113108 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:10.113056 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8" event={"ID":"43ede598-957f-413f-a2bb-70e1f52d25fe","Type":"ContainerStarted","Data":"f7c650e6441c8fa5a796ffc572cb5b3021f31cb41336021064a548ac05ec0bbb"}
Apr 23 16:51:10.113108 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:10.113097 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8" event={"ID":"43ede598-957f-413f-a2bb-70e1f52d25fe","Type":"ContainerStarted","Data":"a6c45c852dd6036748b80be88a03932deb73f6255a1a234394dd065d3c59ea4a"}
Apr 23 16:51:11.119245 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:11.119202 2580 generic.go:358] "Generic (PLEG): container finished" podID="8c2041ac-d2b9-467c-95aa-1ff25244ac2b" containerID="b1445dab32dbceb7df65acdf60ecf47d08ed29b05628e622ed9e9b54e99bad2d" exitCode=0
Apr 23 16:51:11.119738 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:11.119283 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt" event={"ID":"8c2041ac-d2b9-467c-95aa-1ff25244ac2b","Type":"ContainerDied","Data":"b1445dab32dbceb7df65acdf60ecf47d08ed29b05628e622ed9e9b54e99bad2d"}
Apr 23 16:51:12.126754 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:12.126712 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt" event={"ID":"8c2041ac-d2b9-467c-95aa-1ff25244ac2b","Type":"ContainerStarted","Data":"ec3b4e08ad445fac35cdf37640a4083fc42872e272567f30f09f8634da600d10"}
Apr 23 16:51:12.127242 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:12.126766 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt" event={"ID":"8c2041ac-d2b9-467c-95aa-1ff25244ac2b","Type":"ContainerStarted","Data":"05a00d9f22119a731eae9ca750ab772b5f3dab2562d620906c1c31694794c883"}
Apr 23 16:51:12.127242 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:12.126886 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt"
Apr 23 16:51:12.153272 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:12.153190 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt" podStartSLOduration=3.153173091 podStartE2EDuration="3.153173091s" podCreationTimestamp="2026-04-23 16:51:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:51:12.149581688 +0000 UTC m=+958.210655954" watchObservedRunningTime="2026-04-23 16:51:12.153173091 +0000 UTC m=+958.214247376"
Apr 23 16:51:14.143391 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:14.143308 2580 generic.go:358] "Generic (PLEG): container finished" podID="43ede598-957f-413f-a2bb-70e1f52d25fe" containerID="f7c650e6441c8fa5a796ffc572cb5b3021f31cb41336021064a548ac05ec0bbb" exitCode=0
Apr 23 16:51:14.143725 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:14.143382 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8" event={"ID":"43ede598-957f-413f-a2bb-70e1f52d25fe","Type":"ContainerDied","Data":"f7c650e6441c8fa5a796ffc572cb5b3021f31cb41336021064a548ac05ec0bbb"}
Apr 23 16:51:15.150736 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:15.150701 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8" event={"ID":"43ede598-957f-413f-a2bb-70e1f52d25fe","Type":"ContainerStarted","Data":"cfa42b24e6d2aeaebebf843e5d6e4308f5bc333309213ae4276890c7df8fdd2e"}
Apr 23 16:51:15.173708 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:15.173656 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8" podStartSLOduration=7.173641725 podStartE2EDuration="7.173641725s" podCreationTimestamp="2026-04-23 16:51:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:51:15.170056266 +0000 UTC
m=+961.231130534" watchObservedRunningTime="2026-04-23 16:51:15.173641725 +0000 UTC m=+961.234715979" Apr 23 16:51:19.306777 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:19.306741 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8" Apr 23 16:51:19.307400 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:19.306795 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8" Apr 23 16:51:19.322107 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:19.322059 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8" Apr 23 16:51:19.575277 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:19.575190 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt" Apr 23 16:51:19.575277 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:19.575249 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt" Apr 23 16:51:19.576669 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:51:19.576635 2580 logging.go:55] [core] [Channel #35 SubChannel #36]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.56:9003", ServerName: "10.132.0.56:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.56:9003: connect: connection refused" Apr 23 16:51:19.576669 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:19.576657 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt" podUID="8c2041ac-d2b9-467c-95aa-1ff25244ac2b" containerName="tokenizer" probeResult="failure" output="Get \"http://10.132.0.56:8082/healthz\": dial tcp 10.132.0.56:8082: connect: connection refused" Apr 23 16:51:20.188249 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:20.188218 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8" Apr 23 16:51:20.576286 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:20.576241 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt" podUID="8c2041ac-d2b9-467c-95aa-1ff25244ac2b" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.56:9003\" within 1s: context deadline exceeded" Apr 23 16:51:29.575885 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:51:29.575800 2580 logging.go:55] [core] [Channel #37 SubChannel #38]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.56:9003", ServerName: "10.132.0.56:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.56:9003: connect: connection refused" Apr 23 16:51:29.577492 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:29.577460 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt" Apr 23 16:51:29.578722 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:29.578703 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt" Apr 23 16:51:30.576516 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:30.576458 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt" podUID="8c2041ac-d2b9-467c-95aa-1ff25244ac2b" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.56:9003\" within 1s: context deadline exceeded" Apr 23 16:51:32.241968 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:32.241928 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" event={"ID":"e081da07-6ba7-41fa-925c-1c9264885cff","Type":"ContainerStarted","Data":"9e4ce6dfeaf9b0e026f9ac4ea1536ce735fe70c388e030d95ca7052a3c79d580"} Apr 23 16:51:32.265311 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:32.265246 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" podStartSLOduration=17.745588891 podStartE2EDuration="1m4.265231209s" podCreationTimestamp="2026-04-23 16:50:28 +0000 UTC" firstStartedPulling="2026-04-23 16:50:44.980576633 +0000 UTC m=+931.041650865" lastFinishedPulling="2026-04-23 16:51:31.500218951 +0000 UTC m=+977.561293183" observedRunningTime="2026-04-23 16:51:32.264022696 +0000 UTC 
m=+978.325096951" watchObservedRunningTime="2026-04-23 16:51:32.265231209 +0000 UTC m=+978.326305490" Apr 23 16:51:38.473084 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:38.473039 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" Apr 23 16:51:38.473492 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:38.473095 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" Apr 23 16:51:38.474437 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:38.474411 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" podUID="e081da07-6ba7-41fa-925c-1c9264885cff" containerName="main" probeResult="failure" output="Get \"https://10.132.0.54:8000/health\": dial tcp 10.132.0.54:8000: connect: connection refused" Apr 23 16:51:48.473033 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:48.472986 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" podUID="e081da07-6ba7-41fa-925c-1c9264885cff" containerName="main" probeResult="failure" output="Get \"https://10.132.0.54:8000/health\": dial tcp 10.132.0.54:8000: connect: connection refused" Apr 23 16:51:50.232816 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:50.232783 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt" Apr 23 16:51:51.203381 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:51.203343 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt"] Apr 23 16:51:51.204320 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:51.204253 2580 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt" podUID="8c2041ac-d2b9-467c-95aa-1ff25244ac2b" containerName="main" containerID="cri-o://05a00d9f22119a731eae9ca750ab772b5f3dab2562d620906c1c31694794c883" gracePeriod=30 Apr 23 16:51:51.204866 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:51.204676 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt" podUID="8c2041ac-d2b9-467c-95aa-1ff25244ac2b" containerName="tokenizer" containerID="cri-o://ec3b4e08ad445fac35cdf37640a4083fc42872e272567f30f09f8634da600d10" gracePeriod=30 Apr 23 16:51:51.206219 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:51.206187 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8"] Apr 23 16:51:51.206709 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:51.206679 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8" podUID="43ede598-957f-413f-a2bb-70e1f52d25fe" containerName="main" containerID="cri-o://cfa42b24e6d2aeaebebf843e5d6e4308f5bc333309213ae4276890c7df8fdd2e" gracePeriod=30 Apr 23 16:51:51.982282 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:51.982255 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8" Apr 23 16:51:52.027911 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.027868 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/43ede598-957f-413f-a2bb-70e1f52d25fe-tls-certs\") pod \"43ede598-957f-413f-a2bb-70e1f52d25fe\" (UID: \"43ede598-957f-413f-a2bb-70e1f52d25fe\") " Apr 23 16:51:52.027911 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.027917 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/43ede598-957f-413f-a2bb-70e1f52d25fe-dshm\") pod \"43ede598-957f-413f-a2bb-70e1f52d25fe\" (UID: \"43ede598-957f-413f-a2bb-70e1f52d25fe\") " Apr 23 16:51:52.028160 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.027942 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/43ede598-957f-413f-a2bb-70e1f52d25fe-kserve-provision-location\") pod \"43ede598-957f-413f-a2bb-70e1f52d25fe\" (UID: \"43ede598-957f-413f-a2bb-70e1f52d25fe\") " Apr 23 16:51:52.028160 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.028079 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/43ede598-957f-413f-a2bb-70e1f52d25fe-home\") pod \"43ede598-957f-413f-a2bb-70e1f52d25fe\" (UID: \"43ede598-957f-413f-a2bb-70e1f52d25fe\") " Apr 23 16:51:52.028160 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.028125 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k99p7\" (UniqueName: \"kubernetes.io/projected/43ede598-957f-413f-a2bb-70e1f52d25fe-kube-api-access-k99p7\") pod \"43ede598-957f-413f-a2bb-70e1f52d25fe\" (UID: \"43ede598-957f-413f-a2bb-70e1f52d25fe\") " Apr 23 16:51:52.028363 ip-10-0-128-198 
kubenswrapper[2580]: I0423 16:51:52.028216 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/43ede598-957f-413f-a2bb-70e1f52d25fe-model-cache\") pod \"43ede598-957f-413f-a2bb-70e1f52d25fe\" (UID: \"43ede598-957f-413f-a2bb-70e1f52d25fe\") " Apr 23 16:51:52.030675 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.030515 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ede598-957f-413f-a2bb-70e1f52d25fe-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "43ede598-957f-413f-a2bb-70e1f52d25fe" (UID: "43ede598-957f-413f-a2bb-70e1f52d25fe"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:51:52.030819 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.030751 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43ede598-957f-413f-a2bb-70e1f52d25fe-model-cache" (OuterVolumeSpecName: "model-cache") pod "43ede598-957f-413f-a2bb-70e1f52d25fe" (UID: "43ede598-957f-413f-a2bb-70e1f52d25fe"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:51:52.035392 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.035363 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43ede598-957f-413f-a2bb-70e1f52d25fe-home" (OuterVolumeSpecName: "home") pod "43ede598-957f-413f-a2bb-70e1f52d25fe" (UID: "43ede598-957f-413f-a2bb-70e1f52d25fe"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:51:52.041907 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.041873 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43ede598-957f-413f-a2bb-70e1f52d25fe-dshm" (OuterVolumeSpecName: "dshm") pod "43ede598-957f-413f-a2bb-70e1f52d25fe" (UID: "43ede598-957f-413f-a2bb-70e1f52d25fe"). 
InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:51:52.043077 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.043051 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43ede598-957f-413f-a2bb-70e1f52d25fe-kube-api-access-k99p7" (OuterVolumeSpecName: "kube-api-access-k99p7") pod "43ede598-957f-413f-a2bb-70e1f52d25fe" (UID: "43ede598-957f-413f-a2bb-70e1f52d25fe"). InnerVolumeSpecName "kube-api-access-k99p7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:51:52.095557 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.095465 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43ede598-957f-413f-a2bb-70e1f52d25fe-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "43ede598-957f-413f-a2bb-70e1f52d25fe" (UID: "43ede598-957f-413f-a2bb-70e1f52d25fe"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:51:52.130475 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.130434 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k99p7\" (UniqueName: \"kubernetes.io/projected/43ede598-957f-413f-a2bb-70e1f52d25fe-kube-api-access-k99p7\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:51:52.130649 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.130496 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/43ede598-957f-413f-a2bb-70e1f52d25fe-model-cache\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:51:52.130649 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.130511 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/43ede598-957f-413f-a2bb-70e1f52d25fe-tls-certs\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:51:52.130649 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.130523 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/43ede598-957f-413f-a2bb-70e1f52d25fe-dshm\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:51:52.130649 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.130534 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/43ede598-957f-413f-a2bb-70e1f52d25fe-kserve-provision-location\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:51:52.130649 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.130546 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/43ede598-957f-413f-a2bb-70e1f52d25fe-home\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:51:52.336597 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.336557 2580 
generic.go:358] "Generic (PLEG): container finished" podID="8c2041ac-d2b9-467c-95aa-1ff25244ac2b" containerID="05a00d9f22119a731eae9ca750ab772b5f3dab2562d620906c1c31694794c883" exitCode=0 Apr 23 16:51:52.336793 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.336637 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt" event={"ID":"8c2041ac-d2b9-467c-95aa-1ff25244ac2b","Type":"ContainerDied","Data":"05a00d9f22119a731eae9ca750ab772b5f3dab2562d620906c1c31694794c883"} Apr 23 16:51:52.338253 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.338228 2580 generic.go:358] "Generic (PLEG): container finished" podID="43ede598-957f-413f-a2bb-70e1f52d25fe" containerID="cfa42b24e6d2aeaebebf843e5d6e4308f5bc333309213ae4276890c7df8fdd2e" exitCode=0 Apr 23 16:51:52.338405 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.338315 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8" event={"ID":"43ede598-957f-413f-a2bb-70e1f52d25fe","Type":"ContainerDied","Data":"cfa42b24e6d2aeaebebf843e5d6e4308f5bc333309213ae4276890c7df8fdd2e"} Apr 23 16:51:52.338405 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.338342 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8" event={"ID":"43ede598-957f-413f-a2bb-70e1f52d25fe","Type":"ContainerDied","Data":"a6c45c852dd6036748b80be88a03932deb73f6255a1a234394dd065d3c59ea4a"} Apr 23 16:51:52.338405 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.338361 2580 scope.go:117] "RemoveContainer" containerID="cfa42b24e6d2aeaebebf843e5d6e4308f5bc333309213ae4276890c7df8fdd2e" Apr 23 16:51:52.338405 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.338388 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8" Apr 23 16:51:52.349418 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.349394 2580 scope.go:117] "RemoveContainer" containerID="f7c650e6441c8fa5a796ffc572cb5b3021f31cb41336021064a548ac05ec0bbb" Apr 23 16:51:52.363223 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.363199 2580 scope.go:117] "RemoveContainer" containerID="cfa42b24e6d2aeaebebf843e5d6e4308f5bc333309213ae4276890c7df8fdd2e" Apr 23 16:51:52.363635 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:51:52.363608 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfa42b24e6d2aeaebebf843e5d6e4308f5bc333309213ae4276890c7df8fdd2e\": container with ID starting with cfa42b24e6d2aeaebebf843e5d6e4308f5bc333309213ae4276890c7df8fdd2e not found: ID does not exist" containerID="cfa42b24e6d2aeaebebf843e5d6e4308f5bc333309213ae4276890c7df8fdd2e" Apr 23 16:51:52.363749 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.363647 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfa42b24e6d2aeaebebf843e5d6e4308f5bc333309213ae4276890c7df8fdd2e"} err="failed to get container status \"cfa42b24e6d2aeaebebf843e5d6e4308f5bc333309213ae4276890c7df8fdd2e\": rpc error: code = NotFound desc = could not find container \"cfa42b24e6d2aeaebebf843e5d6e4308f5bc333309213ae4276890c7df8fdd2e\": container with ID starting with cfa42b24e6d2aeaebebf843e5d6e4308f5bc333309213ae4276890c7df8fdd2e not found: ID does not exist" Apr 23 16:51:52.363749 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.363674 2580 scope.go:117] "RemoveContainer" containerID="f7c650e6441c8fa5a796ffc572cb5b3021f31cb41336021064a548ac05ec0bbb" Apr 23 16:51:52.364020 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:51:52.363989 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f7c650e6441c8fa5a796ffc572cb5b3021f31cb41336021064a548ac05ec0bbb\": container with ID starting with f7c650e6441c8fa5a796ffc572cb5b3021f31cb41336021064a548ac05ec0bbb not found: ID does not exist" containerID="f7c650e6441c8fa5a796ffc572cb5b3021f31cb41336021064a548ac05ec0bbb" Apr 23 16:51:52.364140 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.364029 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7c650e6441c8fa5a796ffc572cb5b3021f31cb41336021064a548ac05ec0bbb"} err="failed to get container status \"f7c650e6441c8fa5a796ffc572cb5b3021f31cb41336021064a548ac05ec0bbb\": rpc error: code = NotFound desc = could not find container \"f7c650e6441c8fa5a796ffc572cb5b3021f31cb41336021064a548ac05ec0bbb\": container with ID starting with f7c650e6441c8fa5a796ffc572cb5b3021f31cb41336021064a548ac05ec0bbb not found: ID does not exist" Apr 23 16:51:52.365722 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.365623 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8"] Apr 23 16:51:52.368051 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.368023 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-s6fk8"] Apr 23 16:51:52.538450 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.538405 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43ede598-957f-413f-a2bb-70e1f52d25fe" path="/var/lib/kubelet/pods/43ede598-957f-413f-a2bb-70e1f52d25fe/volumes" Apr 23 16:51:52.810205 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.810175 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt" Apr 23 16:51:52.937346 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.937242 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-tokenizer-uds\") pod \"8c2041ac-d2b9-467c-95aa-1ff25244ac2b\" (UID: \"8c2041ac-d2b9-467c-95aa-1ff25244ac2b\") " Apr 23 16:51:52.937346 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.937286 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-tokenizer-cache\") pod \"8c2041ac-d2b9-467c-95aa-1ff25244ac2b\" (UID: \"8c2041ac-d2b9-467c-95aa-1ff25244ac2b\") " Apr 23 16:51:52.937346 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.937323 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng6vt\" (UniqueName: \"kubernetes.io/projected/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-kube-api-access-ng6vt\") pod \"8c2041ac-d2b9-467c-95aa-1ff25244ac2b\" (UID: \"8c2041ac-d2b9-467c-95aa-1ff25244ac2b\") " Apr 23 16:51:52.937346 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.937346 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-kserve-provision-location\") pod \"8c2041ac-d2b9-467c-95aa-1ff25244ac2b\" (UID: \"8c2041ac-d2b9-467c-95aa-1ff25244ac2b\") " Apr 23 16:51:52.937684 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.937388 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-tls-certs\") pod \"8c2041ac-d2b9-467c-95aa-1ff25244ac2b\" (UID: \"8c2041ac-d2b9-467c-95aa-1ff25244ac2b\") 
" Apr 23 16:51:52.937684 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.937417 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-tokenizer-tmp\") pod \"8c2041ac-d2b9-467c-95aa-1ff25244ac2b\" (UID: \"8c2041ac-d2b9-467c-95aa-1ff25244ac2b\") " Apr 23 16:51:52.937684 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.937579 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "8c2041ac-d2b9-467c-95aa-1ff25244ac2b" (UID: "8c2041ac-d2b9-467c-95aa-1ff25244ac2b"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:51:52.937871 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.937683 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-tokenizer-uds\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:51:52.937871 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.937694 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "8c2041ac-d2b9-467c-95aa-1ff25244ac2b" (UID: "8c2041ac-d2b9-467c-95aa-1ff25244ac2b"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:51:52.937973 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.937900 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "8c2041ac-d2b9-467c-95aa-1ff25244ac2b" (UID: "8c2041ac-d2b9-467c-95aa-1ff25244ac2b"). 
InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:51:52.938170 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.938145 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8c2041ac-d2b9-467c-95aa-1ff25244ac2b" (UID: "8c2041ac-d2b9-467c-95aa-1ff25244ac2b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:51:52.939562 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.939543 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-kube-api-access-ng6vt" (OuterVolumeSpecName: "kube-api-access-ng6vt") pod "8c2041ac-d2b9-467c-95aa-1ff25244ac2b" (UID: "8c2041ac-d2b9-467c-95aa-1ff25244ac2b"). InnerVolumeSpecName "kube-api-access-ng6vt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:51:52.939641 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:52.939553 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8c2041ac-d2b9-467c-95aa-1ff25244ac2b" (UID: "8c2041ac-d2b9-467c-95aa-1ff25244ac2b"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:51:53.038215 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:53.038178 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-tokenizer-cache\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:51:53.038215 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:53.038210 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ng6vt\" (UniqueName: \"kubernetes.io/projected/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-kube-api-access-ng6vt\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:51:53.038215 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:53.038222 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-kserve-provision-location\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:51:53.038659 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:53.038231 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-tls-certs\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:51:53.038659 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:53.038239 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8c2041ac-d2b9-467c-95aa-1ff25244ac2b-tokenizer-tmp\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:51:53.344765 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:53.344729 2580 generic.go:358] "Generic (PLEG): container finished" podID="8c2041ac-d2b9-467c-95aa-1ff25244ac2b" containerID="ec3b4e08ad445fac35cdf37640a4083fc42872e272567f30f09f8634da600d10" exitCode=0 Apr 23 16:51:53.344998 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:53.344809 2580 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt" Apr 23 16:51:53.344998 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:53.344810 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt" event={"ID":"8c2041ac-d2b9-467c-95aa-1ff25244ac2b","Type":"ContainerDied","Data":"ec3b4e08ad445fac35cdf37640a4083fc42872e272567f30f09f8634da600d10"} Apr 23 16:51:53.344998 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:53.344848 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt" event={"ID":"8c2041ac-d2b9-467c-95aa-1ff25244ac2b","Type":"ContainerDied","Data":"feb384418243127670b48f939ac1fff2cdfb61f9920835c7cdb175864345fc86"} Apr 23 16:51:53.344998 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:53.344860 2580 scope.go:117] "RemoveContainer" containerID="ec3b4e08ad445fac35cdf37640a4083fc42872e272567f30f09f8634da600d10" Apr 23 16:51:53.355052 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:53.355031 2580 scope.go:117] "RemoveContainer" containerID="05a00d9f22119a731eae9ca750ab772b5f3dab2562d620906c1c31694794c883" Apr 23 16:51:53.364673 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:53.364633 2580 scope.go:117] "RemoveContainer" containerID="b1445dab32dbceb7df65acdf60ecf47d08ed29b05628e622ed9e9b54e99bad2d" Apr 23 16:51:53.369208 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:53.369186 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt"] Apr 23 16:51:53.375098 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:53.375081 2580 scope.go:117] "RemoveContainer" containerID="ec3b4e08ad445fac35cdf37640a4083fc42872e272567f30f09f8634da600d10" Apr 23 16:51:53.375375 ip-10-0-128-198 kubenswrapper[2580]: 
E0423 16:51:53.375353 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec3b4e08ad445fac35cdf37640a4083fc42872e272567f30f09f8634da600d10\": container with ID starting with ec3b4e08ad445fac35cdf37640a4083fc42872e272567f30f09f8634da600d10 not found: ID does not exist" containerID="ec3b4e08ad445fac35cdf37640a4083fc42872e272567f30f09f8634da600d10" Apr 23 16:51:53.375468 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:53.375382 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec3b4e08ad445fac35cdf37640a4083fc42872e272567f30f09f8634da600d10"} err="failed to get container status \"ec3b4e08ad445fac35cdf37640a4083fc42872e272567f30f09f8634da600d10\": rpc error: code = NotFound desc = could not find container \"ec3b4e08ad445fac35cdf37640a4083fc42872e272567f30f09f8634da600d10\": container with ID starting with ec3b4e08ad445fac35cdf37640a4083fc42872e272567f30f09f8634da600d10 not found: ID does not exist" Apr 23 16:51:53.375468 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:53.375401 2580 scope.go:117] "RemoveContainer" containerID="05a00d9f22119a731eae9ca750ab772b5f3dab2562d620906c1c31694794c883" Apr 23 16:51:53.375641 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:51:53.375620 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05a00d9f22119a731eae9ca750ab772b5f3dab2562d620906c1c31694794c883\": container with ID starting with 05a00d9f22119a731eae9ca750ab772b5f3dab2562d620906c1c31694794c883 not found: ID does not exist" containerID="05a00d9f22119a731eae9ca750ab772b5f3dab2562d620906c1c31694794c883" Apr 23 16:51:53.375684 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:53.375652 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05a00d9f22119a731eae9ca750ab772b5f3dab2562d620906c1c31694794c883"} err="failed to get container 
status \"05a00d9f22119a731eae9ca750ab772b5f3dab2562d620906c1c31694794c883\": rpc error: code = NotFound desc = could not find container \"05a00d9f22119a731eae9ca750ab772b5f3dab2562d620906c1c31694794c883\": container with ID starting with 05a00d9f22119a731eae9ca750ab772b5f3dab2562d620906c1c31694794c883 not found: ID does not exist" Apr 23 16:51:53.375684 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:53.375666 2580 scope.go:117] "RemoveContainer" containerID="b1445dab32dbceb7df65acdf60ecf47d08ed29b05628e622ed9e9b54e99bad2d" Apr 23 16:51:53.375912 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:51:53.375891 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1445dab32dbceb7df65acdf60ecf47d08ed29b05628e622ed9e9b54e99bad2d\": container with ID starting with b1445dab32dbceb7df65acdf60ecf47d08ed29b05628e622ed9e9b54e99bad2d not found: ID does not exist" containerID="b1445dab32dbceb7df65acdf60ecf47d08ed29b05628e622ed9e9b54e99bad2d" Apr 23 16:51:53.375999 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:53.375913 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1445dab32dbceb7df65acdf60ecf47d08ed29b05628e622ed9e9b54e99bad2d"} err="failed to get container status \"b1445dab32dbceb7df65acdf60ecf47d08ed29b05628e622ed9e9b54e99bad2d\": rpc error: code = NotFound desc = could not find container \"b1445dab32dbceb7df65acdf60ecf47d08ed29b05628e622ed9e9b54e99bad2d\": container with ID starting with b1445dab32dbceb7df65acdf60ecf47d08ed29b05628e622ed9e9b54e99bad2d not found: ID does not exist" Apr 23 16:51:53.377401 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:53.377378 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58d7b5c66m4xt"] Apr 23 16:51:54.539091 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:54.539053 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="8c2041ac-d2b9-467c-95aa-1ff25244ac2b" path="/var/lib/kubelet/pods/8c2041ac-d2b9-467c-95aa-1ff25244ac2b/volumes" Apr 23 16:51:58.156518 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.156479 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q"] Apr 23 16:51:58.157058 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.157035 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43ede598-957f-413f-a2bb-70e1f52d25fe" containerName="main" Apr 23 16:51:58.157058 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.157058 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="43ede598-957f-413f-a2bb-70e1f52d25fe" containerName="main" Apr 23 16:51:58.157274 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.157075 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8c2041ac-d2b9-467c-95aa-1ff25244ac2b" containerName="main" Apr 23 16:51:58.157274 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.157084 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c2041ac-d2b9-467c-95aa-1ff25244ac2b" containerName="main" Apr 23 16:51:58.157274 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.157095 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8c2041ac-d2b9-467c-95aa-1ff25244ac2b" containerName="tokenizer" Apr 23 16:51:58.157274 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.157104 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c2041ac-d2b9-467c-95aa-1ff25244ac2b" containerName="tokenizer" Apr 23 16:51:58.157274 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.157117 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43ede598-957f-413f-a2bb-70e1f52d25fe" containerName="storage-initializer" Apr 23 16:51:58.157274 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.157127 2580 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="43ede598-957f-413f-a2bb-70e1f52d25fe" containerName="storage-initializer" Apr 23 16:51:58.157274 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.157181 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8c2041ac-d2b9-467c-95aa-1ff25244ac2b" containerName="storage-initializer" Apr 23 16:51:58.157274 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.157192 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c2041ac-d2b9-467c-95aa-1ff25244ac2b" containerName="storage-initializer" Apr 23 16:51:58.157659 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.157332 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="43ede598-957f-413f-a2bb-70e1f52d25fe" containerName="main" Apr 23 16:51:58.157659 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.157348 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="8c2041ac-d2b9-467c-95aa-1ff25244ac2b" containerName="tokenizer" Apr 23 16:51:58.157659 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.157358 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="8c2041ac-d2b9-467c-95aa-1ff25244ac2b" containerName="main" Apr 23 16:51:58.202433 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.202397 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q"] Apr 23 16:51:58.202597 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.202534 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" Apr 23 16:51:58.205395 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.205367 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 23 16:51:58.280274 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.280235 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-dshm\") pod \"stop-feature-test-kserve-77cc947894-z577q\" (UID: \"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" Apr 23 16:51:58.280502 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.280310 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-model-cache\") pod \"stop-feature-test-kserve-77cc947894-z577q\" (UID: \"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" Apr 23 16:51:58.280502 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.280337 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-kserve-provision-location\") pod \"stop-feature-test-kserve-77cc947894-z577q\" (UID: \"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" Apr 23 16:51:58.280502 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.280368 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrkh2\" (UniqueName: 
\"kubernetes.io/projected/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-kube-api-access-nrkh2\") pod \"stop-feature-test-kserve-77cc947894-z577q\" (UID: \"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" Apr 23 16:51:58.280502 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.280392 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-home\") pod \"stop-feature-test-kserve-77cc947894-z577q\" (UID: \"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" Apr 23 16:51:58.280502 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.280489 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-tls-certs\") pod \"stop-feature-test-kserve-77cc947894-z577q\" (UID: \"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" Apr 23 16:51:58.381689 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.381649 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-model-cache\") pod \"stop-feature-test-kserve-77cc947894-z577q\" (UID: \"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" Apr 23 16:51:58.381884 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.381697 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-kserve-provision-location\") pod \"stop-feature-test-kserve-77cc947894-z577q\" (UID: \"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" Apr 23 16:51:58.381884 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.381740 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrkh2\" (UniqueName: \"kubernetes.io/projected/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-kube-api-access-nrkh2\") pod \"stop-feature-test-kserve-77cc947894-z577q\" (UID: \"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" Apr 23 16:51:58.381884 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.381767 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-home\") pod \"stop-feature-test-kserve-77cc947894-z577q\" (UID: \"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" Apr 23 16:51:58.381884 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.381865 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-tls-certs\") pod \"stop-feature-test-kserve-77cc947894-z577q\" (UID: \"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" Apr 23 16:51:58.382115 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.381949 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-dshm\") pod \"stop-feature-test-kserve-77cc947894-z577q\" (UID: \"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" Apr 23 16:51:58.382171 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.382154 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-kserve-provision-location\") pod \"stop-feature-test-kserve-77cc947894-z577q\" (UID: \"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" Apr 23 16:51:58.382227 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.382210 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-home\") pod \"stop-feature-test-kserve-77cc947894-z577q\" (UID: \"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" Apr 23 16:51:58.382433 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.382398 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-model-cache\") pod \"stop-feature-test-kserve-77cc947894-z577q\" (UID: \"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" Apr 23 16:51:58.384251 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.384231 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-dshm\") pod \"stop-feature-test-kserve-77cc947894-z577q\" (UID: \"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" Apr 23 16:51:58.384609 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.384588 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-tls-certs\") pod \"stop-feature-test-kserve-77cc947894-z577q\" (UID: \"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" Apr 23 16:51:58.391606 ip-10-0-128-198 kubenswrapper[2580]: 
I0423 16:51:58.391586 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrkh2\" (UniqueName: \"kubernetes.io/projected/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-kube-api-access-nrkh2\") pod \"stop-feature-test-kserve-77cc947894-z577q\" (UID: \"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" Apr 23 16:51:58.412948 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.412868 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w"] Apr 23 16:51:58.442778 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.442744 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w"] Apr 23 16:51:58.442919 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.442857 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" Apr 23 16:51:58.445593 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.445572 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-hkwpx\"" Apr 23 16:51:58.473673 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.473633 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" podUID="e081da07-6ba7-41fa-925c-1c9264885cff" containerName="main" probeResult="failure" output="Get \"https://10.132.0.54:8000/health\": dial tcp 10.132.0.54:8000: connect: connection refused" Apr 23 16:51:58.482458 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.482430 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/0031f665-7313-4a25-a9df-297c7fda5a22-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w\" (UID: \"0031f665-7313-4a25-a9df-297c7fda5a22\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" Apr 23 16:51:58.482619 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.482472 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0031f665-7313-4a25-a9df-297c7fda5a22-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w\" (UID: \"0031f665-7313-4a25-a9df-297c7fda5a22\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" Apr 23 16:51:58.482619 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.482492 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0031f665-7313-4a25-a9df-297c7fda5a22-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w\" (UID: \"0031f665-7313-4a25-a9df-297c7fda5a22\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" Apr 23 16:51:58.482619 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.482535 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zdkt\" (UniqueName: \"kubernetes.io/projected/0031f665-7313-4a25-a9df-297c7fda5a22-kube-api-access-2zdkt\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w\" (UID: \"0031f665-7313-4a25-a9df-297c7fda5a22\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" Apr 23 16:51:58.482619 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.482565 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/0031f665-7313-4a25-a9df-297c7fda5a22-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w\" (UID: \"0031f665-7313-4a25-a9df-297c7fda5a22\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" Apr 23 16:51:58.482619 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.482589 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0031f665-7313-4a25-a9df-297c7fda5a22-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w\" (UID: \"0031f665-7313-4a25-a9df-297c7fda5a22\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" Apr 23 16:51:58.516182 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.516148 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" Apr 23 16:51:58.583482 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.583440 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0031f665-7313-4a25-a9df-297c7fda5a22-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w\" (UID: \"0031f665-7313-4a25-a9df-297c7fda5a22\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" Apr 23 16:51:58.583647 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.583557 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0031f665-7313-4a25-a9df-297c7fda5a22-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w\" (UID: \"0031f665-7313-4a25-a9df-297c7fda5a22\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" Apr 
23 16:51:58.583647 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.583601 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0031f665-7313-4a25-a9df-297c7fda5a22-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w\" (UID: \"0031f665-7313-4a25-a9df-297c7fda5a22\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" Apr 23 16:51:58.583647 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.583626 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0031f665-7313-4a25-a9df-297c7fda5a22-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w\" (UID: \"0031f665-7313-4a25-a9df-297c7fda5a22\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" Apr 23 16:51:58.583828 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.583673 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2zdkt\" (UniqueName: \"kubernetes.io/projected/0031f665-7313-4a25-a9df-297c7fda5a22-kube-api-access-2zdkt\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w\" (UID: \"0031f665-7313-4a25-a9df-297c7fda5a22\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" Apr 23 16:51:58.583828 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.583725 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0031f665-7313-4a25-a9df-297c7fda5a22-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w\" (UID: \"0031f665-7313-4a25-a9df-297c7fda5a22\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" Apr 23 16:51:58.583961 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.583852 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0031f665-7313-4a25-a9df-297c7fda5a22-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w\" (UID: \"0031f665-7313-4a25-a9df-297c7fda5a22\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" Apr 23 16:51:58.583961 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.583899 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0031f665-7313-4a25-a9df-297c7fda5a22-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w\" (UID: \"0031f665-7313-4a25-a9df-297c7fda5a22\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" Apr 23 16:51:58.584093 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.584068 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0031f665-7313-4a25-a9df-297c7fda5a22-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w\" (UID: \"0031f665-7313-4a25-a9df-297c7fda5a22\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" Apr 23 16:51:58.584173 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.584153 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0031f665-7313-4a25-a9df-297c7fda5a22-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w\" (UID: \"0031f665-7313-4a25-a9df-297c7fda5a22\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" Apr 23 16:51:58.586754 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.586723 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0031f665-7313-4a25-a9df-297c7fda5a22-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w\" (UID: \"0031f665-7313-4a25-a9df-297c7fda5a22\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" Apr 23 16:51:58.595004 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.594956 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zdkt\" (UniqueName: \"kubernetes.io/projected/0031f665-7313-4a25-a9df-297c7fda5a22-kube-api-access-2zdkt\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w\" (UID: \"0031f665-7313-4a25-a9df-297c7fda5a22\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" Apr 23 16:51:58.684947 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.684913 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q"] Apr 23 16:51:58.686937 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:51:58.686906 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60b1ccbf_0d81_48bc_8e76_3cca3eb7ad4f.slice/crio-54654088e7e2e40e136e7cc75280a3a21ee24ff616bf25d9c72ddc3615645a38 WatchSource:0}: Error finding container 54654088e7e2e40e136e7cc75280a3a21ee24ff616bf25d9c72ddc3615645a38: Status 404 returned error can't find the container with id 54654088e7e2e40e136e7cc75280a3a21ee24ff616bf25d9c72ddc3615645a38 Apr 23 16:51:58.753413 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.753384 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" Apr 23 16:51:58.898147 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:58.898049 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w"] Apr 23 16:51:58.900432 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:51:58.900394 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0031f665_7313_4a25_a9df_297c7fda5a22.slice/crio-0fa5f4d3ee94e72f926b0b3aad6ca040fd5a3cebb0e21eb708578991e05eac9e WatchSource:0}: Error finding container 0fa5f4d3ee94e72f926b0b3aad6ca040fd5a3cebb0e21eb708578991e05eac9e: Status 404 returned error can't find the container with id 0fa5f4d3ee94e72f926b0b3aad6ca040fd5a3cebb0e21eb708578991e05eac9e Apr 23 16:51:59.373740 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:59.373703 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" event={"ID":"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f","Type":"ContainerStarted","Data":"0933f7b5d606b14445437a23bac67dd9f95e894bde0bea57963a42ed2ee0cfda"} Apr 23 16:51:59.374218 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:59.373748 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" event={"ID":"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f","Type":"ContainerStarted","Data":"54654088e7e2e40e136e7cc75280a3a21ee24ff616bf25d9c72ddc3615645a38"} Apr 23 16:51:59.375404 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:59.375379 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" event={"ID":"0031f665-7313-4a25-a9df-297c7fda5a22","Type":"ContainerStarted","Data":"15f8faf61d1eda034ff650d5b1bf44bc12249588a8ad5ebba88849633c6c8188"} Apr 23 16:51:59.375404 
ip-10-0-128-198 kubenswrapper[2580]: I0423 16:51:59.375406 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" event={"ID":"0031f665-7313-4a25-a9df-297c7fda5a22","Type":"ContainerStarted","Data":"0fa5f4d3ee94e72f926b0b3aad6ca040fd5a3cebb0e21eb708578991e05eac9e"} Apr 23 16:52:00.384989 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:52:00.384946 2580 generic.go:358] "Generic (PLEG): container finished" podID="0031f665-7313-4a25-a9df-297c7fda5a22" containerID="15f8faf61d1eda034ff650d5b1bf44bc12249588a8ad5ebba88849633c6c8188" exitCode=0 Apr 23 16:52:00.385504 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:52:00.385084 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" event={"ID":"0031f665-7313-4a25-a9df-297c7fda5a22","Type":"ContainerDied","Data":"15f8faf61d1eda034ff650d5b1bf44bc12249588a8ad5ebba88849633c6c8188"} Apr 23 16:52:01.395238 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:52:01.395198 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" event={"ID":"0031f665-7313-4a25-a9df-297c7fda5a22","Type":"ContainerStarted","Data":"c78095d97fa5616698e9b406ca0d696706d13efd08bcf9cacf87a5be5e93f9e0"} Apr 23 16:52:01.395238 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:52:01.395248 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" event={"ID":"0031f665-7313-4a25-a9df-297c7fda5a22","Type":"ContainerStarted","Data":"9692ecc617eb20a707655a82433d536ee9dd6b9fb257a5efbb64c201f3867d5a"} Apr 23 16:52:01.395792 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:52:01.395330 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" Apr 23 
16:52:01.419924 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:52:01.419856 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" podStartSLOduration=3.419838219 podStartE2EDuration="3.419838219s" podCreationTimestamp="2026-04-23 16:51:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:52:01.417380311 +0000 UTC m=+1007.478454577" watchObservedRunningTime="2026-04-23 16:52:01.419838219 +0000 UTC m=+1007.480912472" Apr 23 16:52:03.406804 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:52:03.406718 2580 generic.go:358] "Generic (PLEG): container finished" podID="60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f" containerID="0933f7b5d606b14445437a23bac67dd9f95e894bde0bea57963a42ed2ee0cfda" exitCode=0 Apr 23 16:52:03.407150 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:52:03.406794 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" event={"ID":"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f","Type":"ContainerDied","Data":"0933f7b5d606b14445437a23bac67dd9f95e894bde0bea57963a42ed2ee0cfda"} Apr 23 16:52:04.415127 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:52:04.415081 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" event={"ID":"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f","Type":"ContainerStarted","Data":"64873c17b3cc12cf9456e8b36de9bc9d4fbbe6e550d8abbb6131b6583d863f58"} Apr 23 16:52:04.440684 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:52:04.440617 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" podStartSLOduration=6.440601553 podStartE2EDuration="6.440601553s" podCreationTimestamp="2026-04-23 16:51:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:52:04.437239289 +0000 UTC m=+1010.498313546" watchObservedRunningTime="2026-04-23 16:52:04.440601553 +0000 UTC m=+1010.501675807" Apr 23 16:52:08.473626 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:52:08.473585 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" podUID="e081da07-6ba7-41fa-925c-1c9264885cff" containerName="main" probeResult="failure" output="Get \"https://10.132.0.54:8000/health\": dial tcp 10.132.0.54:8000: connect: connection refused" Apr 23 16:52:08.516593 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:52:08.516558 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" Apr 23 16:52:08.516593 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:52:08.516609 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" Apr 23 16:52:08.518218 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:52:08.518174 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" podUID="60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.57:8000/health\": dial tcp 10.132.0.57:8000: connect: connection refused" Apr 23 16:52:08.753815 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:52:08.753713 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" Apr 23 16:52:08.753815 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:52:08.753774 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" Apr 23 16:52:08.756734 
ip-10-0-128-198 kubenswrapper[2580]: I0423 16:52:08.756706 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" Apr 23 16:52:09.438190 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:52:09.438159 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" Apr 23 16:52:18.473257 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:52:18.473209 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" podUID="e081da07-6ba7-41fa-925c-1c9264885cff" containerName="main" probeResult="failure" output="Get \"https://10.132.0.54:8000/health\": dial tcp 10.132.0.54:8000: connect: connection refused" Apr 23 16:52:18.517332 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:52:18.517277 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" podUID="60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.57:8000/health\": dial tcp 10.132.0.57:8000: connect: connection refused" Apr 23 16:52:28.472763 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:52:28.472719 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" podUID="e081da07-6ba7-41fa-925c-1c9264885cff" containerName="main" probeResult="failure" output="Get \"https://10.132.0.54:8000/health\": dial tcp 10.132.0.54:8000: connect: connection refused" Apr 23 16:52:28.517346 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:52:28.517301 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" podUID="60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f" containerName="main" probeResult="failure" output="Get 
\"https://10.132.0.57:8000/health\": dial tcp 10.132.0.57:8000: connect: connection refused" Apr 23 16:52:30.443031 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:52:30.442999 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" Apr 23 16:52:38.473408 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:52:38.473358 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" podUID="e081da07-6ba7-41fa-925c-1c9264885cff" containerName="main" probeResult="failure" output="Get \"https://10.132.0.54:8000/health\": dial tcp 10.132.0.54:8000: connect: connection refused" Apr 23 16:52:38.517463 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:52:38.517418 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" podUID="60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.57:8000/health\": dial tcp 10.132.0.57:8000: connect: connection refused" Apr 23 16:52:48.473414 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:52:48.473364 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" podUID="e081da07-6ba7-41fa-925c-1c9264885cff" containerName="main" probeResult="failure" output="Get \"https://10.132.0.54:8000/health\": dial tcp 10.132.0.54:8000: connect: connection refused" Apr 23 16:52:48.517243 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:52:48.517183 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" podUID="60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.57:8000/health\": dial tcp 10.132.0.57:8000: connect: connection refused" Apr 23 16:52:58.473588 
ip-10-0-128-198 kubenswrapper[2580]: I0423 16:52:58.473491 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" podUID="e081da07-6ba7-41fa-925c-1c9264885cff" containerName="main" probeResult="failure" output="Get \"https://10.132.0.54:8000/health\": dial tcp 10.132.0.54:8000: connect: connection refused" Apr 23 16:52:58.517395 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:52:58.517352 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" podUID="60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.57:8000/health\": dial tcp 10.132.0.57:8000: connect: connection refused" Apr 23 16:53:08.473535 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:08.473493 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" podUID="e081da07-6ba7-41fa-925c-1c9264885cff" containerName="main" probeResult="failure" output="Get \"https://10.132.0.54:8000/health\": dial tcp 10.132.0.54:8000: connect: connection refused" Apr 23 16:53:08.517008 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:08.516966 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" podUID="60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.57:8000/health\": dial tcp 10.132.0.57:8000: connect: connection refused" Apr 23 16:53:18.482994 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:18.482960 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" Apr 23 16:53:18.491592 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:18.491560 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" Apr 23 16:53:18.516990 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:18.516955 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" podUID="60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.57:8000/health\": dial tcp 10.132.0.57:8000: connect: connection refused" Apr 23 16:53:28.145394 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:28.145354 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4"] Apr 23 16:53:28.145984 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:28.145645 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" podUID="e081da07-6ba7-41fa-925c-1c9264885cff" containerName="main" containerID="cri-o://9e4ce6dfeaf9b0e026f9ac4ea1536ce735fe70c388e030d95ca7052a3c79d580" gracePeriod=30 Apr 23 16:53:28.516809 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:28.516759 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" podUID="60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.57:8000/health\": dial tcp 10.132.0.57:8000: connect: connection refused" Apr 23 16:53:38.517152 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:38.517095 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" podUID="60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.57:8000/health\": dial tcp 10.132.0.57:8000: connect: connection refused" Apr 23 16:53:39.366306 ip-10-0-128-198 kubenswrapper[2580]: 
I0423 16:53:39.366248 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d"] Apr 23 16:53:39.372223 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:39.372193 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" Apr 23 16:53:39.375206 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:39.375174 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 23 16:53:39.380152 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:39.380126 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d"] Apr 23 16:53:39.528516 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:39.528481 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/132366bd-209d-482b-9ee7-1060ce431a79-tls-certs\") pod \"custom-route-timeout-test-kserve-5594956b58-gtd9d\" (UID: \"132366bd-209d-482b-9ee7-1060ce431a79\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" Apr 23 16:53:39.528902 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:39.528547 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/132366bd-209d-482b-9ee7-1060ce431a79-dshm\") pod \"custom-route-timeout-test-kserve-5594956b58-gtd9d\" (UID: \"132366bd-209d-482b-9ee7-1060ce431a79\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" Apr 23 16:53:39.528902 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:39.528571 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/132366bd-209d-482b-9ee7-1060ce431a79-model-cache\") pod \"custom-route-timeout-test-kserve-5594956b58-gtd9d\" (UID: \"132366bd-209d-482b-9ee7-1060ce431a79\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" Apr 23 16:53:39.528902 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:39.528595 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/132366bd-209d-482b-9ee7-1060ce431a79-home\") pod \"custom-route-timeout-test-kserve-5594956b58-gtd9d\" (UID: \"132366bd-209d-482b-9ee7-1060ce431a79\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" Apr 23 16:53:39.528902 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:39.528617 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/132366bd-209d-482b-9ee7-1060ce431a79-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-5594956b58-gtd9d\" (UID: \"132366bd-209d-482b-9ee7-1060ce431a79\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" Apr 23 16:53:39.528902 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:39.528652 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8z62\" (UniqueName: \"kubernetes.io/projected/132366bd-209d-482b-9ee7-1060ce431a79-kube-api-access-w8z62\") pod \"custom-route-timeout-test-kserve-5594956b58-gtd9d\" (UID: \"132366bd-209d-482b-9ee7-1060ce431a79\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" Apr 23 16:53:39.630030 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:39.629939 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/132366bd-209d-482b-9ee7-1060ce431a79-tls-certs\") pod 
\"custom-route-timeout-test-kserve-5594956b58-gtd9d\" (UID: \"132366bd-209d-482b-9ee7-1060ce431a79\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" Apr 23 16:53:39.630030 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:39.630025 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/132366bd-209d-482b-9ee7-1060ce431a79-dshm\") pod \"custom-route-timeout-test-kserve-5594956b58-gtd9d\" (UID: \"132366bd-209d-482b-9ee7-1060ce431a79\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" Apr 23 16:53:39.630303 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:39.630056 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/132366bd-209d-482b-9ee7-1060ce431a79-model-cache\") pod \"custom-route-timeout-test-kserve-5594956b58-gtd9d\" (UID: \"132366bd-209d-482b-9ee7-1060ce431a79\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" Apr 23 16:53:39.630303 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:39.630096 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/132366bd-209d-482b-9ee7-1060ce431a79-home\") pod \"custom-route-timeout-test-kserve-5594956b58-gtd9d\" (UID: \"132366bd-209d-482b-9ee7-1060ce431a79\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" Apr 23 16:53:39.630303 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:39.630249 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/132366bd-209d-482b-9ee7-1060ce431a79-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-5594956b58-gtd9d\" (UID: \"132366bd-209d-482b-9ee7-1060ce431a79\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" Apr 23 
16:53:39.630483 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:39.630323 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w8z62\" (UniqueName: \"kubernetes.io/projected/132366bd-209d-482b-9ee7-1060ce431a79-kube-api-access-w8z62\") pod \"custom-route-timeout-test-kserve-5594956b58-gtd9d\" (UID: \"132366bd-209d-482b-9ee7-1060ce431a79\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" Apr 23 16:53:39.630582 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:39.630557 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/132366bd-209d-482b-9ee7-1060ce431a79-model-cache\") pod \"custom-route-timeout-test-kserve-5594956b58-gtd9d\" (UID: \"132366bd-209d-482b-9ee7-1060ce431a79\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" Apr 23 16:53:39.630711 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:39.630653 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/132366bd-209d-482b-9ee7-1060ce431a79-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-5594956b58-gtd9d\" (UID: \"132366bd-209d-482b-9ee7-1060ce431a79\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" Apr 23 16:53:39.630865 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:39.630844 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/132366bd-209d-482b-9ee7-1060ce431a79-home\") pod \"custom-route-timeout-test-kserve-5594956b58-gtd9d\" (UID: \"132366bd-209d-482b-9ee7-1060ce431a79\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" Apr 23 16:53:39.632545 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:39.632507 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/132366bd-209d-482b-9ee7-1060ce431a79-dshm\") pod \"custom-route-timeout-test-kserve-5594956b58-gtd9d\" (UID: \"132366bd-209d-482b-9ee7-1060ce431a79\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" Apr 23 16:53:39.632718 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:39.632699 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/132366bd-209d-482b-9ee7-1060ce431a79-tls-certs\") pod \"custom-route-timeout-test-kserve-5594956b58-gtd9d\" (UID: \"132366bd-209d-482b-9ee7-1060ce431a79\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" Apr 23 16:53:39.639567 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:39.639542 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8z62\" (UniqueName: \"kubernetes.io/projected/132366bd-209d-482b-9ee7-1060ce431a79-kube-api-access-w8z62\") pod \"custom-route-timeout-test-kserve-5594956b58-gtd9d\" (UID: \"132366bd-209d-482b-9ee7-1060ce431a79\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" Apr 23 16:53:39.686180 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:39.686135 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" Apr 23 16:53:39.838808 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:39.838782 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d"] Apr 23 16:53:40.842860 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:40.842819 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" event={"ID":"132366bd-209d-482b-9ee7-1060ce431a79","Type":"ContainerStarted","Data":"102c5155d9d15af30d6d86419e3e6c8c5d5bc7b32bea6424b349991e8f95fc7b"} Apr 23 16:53:40.842860 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:40.842865 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" event={"ID":"132366bd-209d-482b-9ee7-1060ce431a79","Type":"ContainerStarted","Data":"b86b43b4856ccd58b1a0d7f5bf363b8184cf7437c81b429c1d3aa4dc0b3ac663"} Apr 23 16:53:44.860458 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:44.860420 2580 generic.go:358] "Generic (PLEG): container finished" podID="132366bd-209d-482b-9ee7-1060ce431a79" containerID="102c5155d9d15af30d6d86419e3e6c8c5d5bc7b32bea6424b349991e8f95fc7b" exitCode=0 Apr 23 16:53:44.860904 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:44.860494 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" event={"ID":"132366bd-209d-482b-9ee7-1060ce431a79","Type":"ContainerDied","Data":"102c5155d9d15af30d6d86419e3e6c8c5d5bc7b32bea6424b349991e8f95fc7b"} Apr 23 16:53:45.866888 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:45.866846 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" 
event={"ID":"132366bd-209d-482b-9ee7-1060ce431a79","Type":"ContainerStarted","Data":"e265f5bd407a16803f65753b310725b3011ae68a512c3f5ee34a8d6d3f7a3d32"} Apr 23 16:53:45.888011 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:45.887939 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" podStartSLOduration=6.887920315 podStartE2EDuration="6.887920315s" podCreationTimestamp="2026-04-23 16:53:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:53:45.88665956 +0000 UTC m=+1111.947733851" watchObservedRunningTime="2026-04-23 16:53:45.887920315 +0000 UTC m=+1111.948994569" Apr 23 16:53:48.516917 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:48.516858 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" podUID="60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.57:8000/health\": dial tcp 10.132.0.57:8000: connect: connection refused" Apr 23 16:53:49.686693 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:49.686656 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" Apr 23 16:53:49.686693 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:49.686700 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" Apr 23 16:53:49.688304 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:49.688254 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" podUID="132366bd-209d-482b-9ee7-1060ce431a79" containerName="main" probeResult="failure" output="Get \"https://10.132.0.59:8000/health\": dial tcp 
10.132.0.59:8000: connect: connection refused" Apr 23 16:53:58.467007 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.466976 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4_e081da07-6ba7-41fa-925c-1c9264885cff/main/0.log" Apr 23 16:53:58.467460 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.467431 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" Apr 23 16:53:58.513966 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.513937 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e081da07-6ba7-41fa-925c-1c9264885cff-home\") pod \"e081da07-6ba7-41fa-925c-1c9264885cff\" (UID: \"e081da07-6ba7-41fa-925c-1c9264885cff\") " Apr 23 16:53:58.514148 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.513982 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e081da07-6ba7-41fa-925c-1c9264885cff-kserve-provision-location\") pod \"e081da07-6ba7-41fa-925c-1c9264885cff\" (UID: \"e081da07-6ba7-41fa-925c-1c9264885cff\") " Apr 23 16:53:58.514148 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.514026 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e081da07-6ba7-41fa-925c-1c9264885cff-model-cache\") pod \"e081da07-6ba7-41fa-925c-1c9264885cff\" (UID: \"e081da07-6ba7-41fa-925c-1c9264885cff\") " Apr 23 16:53:58.514148 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.514055 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e081da07-6ba7-41fa-925c-1c9264885cff-dshm\") pod \"e081da07-6ba7-41fa-925c-1c9264885cff\" 
(UID: \"e081da07-6ba7-41fa-925c-1c9264885cff\") " Apr 23 16:53:58.514148 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.514075 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e081da07-6ba7-41fa-925c-1c9264885cff-tls-certs\") pod \"e081da07-6ba7-41fa-925c-1c9264885cff\" (UID: \"e081da07-6ba7-41fa-925c-1c9264885cff\") " Apr 23 16:53:58.514148 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.514128 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5zfb\" (UniqueName: \"kubernetes.io/projected/e081da07-6ba7-41fa-925c-1c9264885cff-kube-api-access-r5zfb\") pod \"e081da07-6ba7-41fa-925c-1c9264885cff\" (UID: \"e081da07-6ba7-41fa-925c-1c9264885cff\") " Apr 23 16:53:58.514457 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.514271 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e081da07-6ba7-41fa-925c-1c9264885cff-model-cache" (OuterVolumeSpecName: "model-cache") pod "e081da07-6ba7-41fa-925c-1c9264885cff" (UID: "e081da07-6ba7-41fa-925c-1c9264885cff"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:53:58.514457 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.514321 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e081da07-6ba7-41fa-925c-1c9264885cff-home" (OuterVolumeSpecName: "home") pod "e081da07-6ba7-41fa-925c-1c9264885cff" (UID: "e081da07-6ba7-41fa-925c-1c9264885cff"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:53:58.514457 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.514408 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e081da07-6ba7-41fa-925c-1c9264885cff-model-cache\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:53:58.514457 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.514426 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e081da07-6ba7-41fa-925c-1c9264885cff-home\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:53:58.516678 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.516613 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e081da07-6ba7-41fa-925c-1c9264885cff-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e081da07-6ba7-41fa-925c-1c9264885cff" (UID: "e081da07-6ba7-41fa-925c-1c9264885cff"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:53:58.516805 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.516734 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e081da07-6ba7-41fa-925c-1c9264885cff-dshm" (OuterVolumeSpecName: "dshm") pod "e081da07-6ba7-41fa-925c-1c9264885cff" (UID: "e081da07-6ba7-41fa-925c-1c9264885cff"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:53:58.516994 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.516964 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e081da07-6ba7-41fa-925c-1c9264885cff-kube-api-access-r5zfb" (OuterVolumeSpecName: "kube-api-access-r5zfb") pod "e081da07-6ba7-41fa-925c-1c9264885cff" (UID: "e081da07-6ba7-41fa-925c-1c9264885cff"). InnerVolumeSpecName "kube-api-access-r5zfb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:53:58.526647 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.526544 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" Apr 23 16:53:58.539899 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.539879 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" Apr 23 16:53:58.562111 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.562074 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e081da07-6ba7-41fa-925c-1c9264885cff-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e081da07-6ba7-41fa-925c-1c9264885cff" (UID: "e081da07-6ba7-41fa-925c-1c9264885cff"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:53:58.615450 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.615408 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e081da07-6ba7-41fa-925c-1c9264885cff-kserve-provision-location\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:53:58.615450 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.615436 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e081da07-6ba7-41fa-925c-1c9264885cff-dshm\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:53:58.615450 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.615446 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e081da07-6ba7-41fa-925c-1c9264885cff-tls-certs\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:53:58.615450 ip-10-0-128-198 kubenswrapper[2580]: I0423 
16:53:58.615459 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r5zfb\" (UniqueName: \"kubernetes.io/projected/e081da07-6ba7-41fa-925c-1c9264885cff-kube-api-access-r5zfb\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:53:58.923720 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.923689 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4_e081da07-6ba7-41fa-925c-1c9264885cff/main/0.log" Apr 23 16:53:58.924079 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.924055 2580 generic.go:358] "Generic (PLEG): container finished" podID="e081da07-6ba7-41fa-925c-1c9264885cff" containerID="9e4ce6dfeaf9b0e026f9ac4ea1536ce735fe70c388e030d95ca7052a3c79d580" exitCode=137 Apr 23 16:53:58.924176 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.924132 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" event={"ID":"e081da07-6ba7-41fa-925c-1c9264885cff","Type":"ContainerDied","Data":"9e4ce6dfeaf9b0e026f9ac4ea1536ce735fe70c388e030d95ca7052a3c79d580"} Apr 23 16:53:58.924176 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.924172 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" event={"ID":"e081da07-6ba7-41fa-925c-1c9264885cff","Type":"ContainerDied","Data":"cd33191771181f75772e820b0aecb3a3df453b881da6b097f5dbf65864b4c286"} Apr 23 16:53:58.924325 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.924188 2580 scope.go:117] "RemoveContainer" containerID="9e4ce6dfeaf9b0e026f9ac4ea1536ce735fe70c388e030d95ca7052a3c79d580" Apr 23 16:53:58.924325 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.924141 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4" Apr 23 16:53:58.944548 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.944502 2580 scope.go:117] "RemoveContainer" containerID="0ae56e413c6b42445b35ec5b580ee8cd59c93a9b811d58b6e1665319b4d62b9e" Apr 23 16:53:58.949182 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.949158 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4"] Apr 23 16:53:58.953149 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.953126 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c7db78c4d7t7c4"] Apr 23 16:53:58.991813 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.991785 2580 scope.go:117] "RemoveContainer" containerID="9e4ce6dfeaf9b0e026f9ac4ea1536ce735fe70c388e030d95ca7052a3c79d580" Apr 23 16:53:58.992137 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:53:58.992121 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e4ce6dfeaf9b0e026f9ac4ea1536ce735fe70c388e030d95ca7052a3c79d580\": container with ID starting with 9e4ce6dfeaf9b0e026f9ac4ea1536ce735fe70c388e030d95ca7052a3c79d580 not found: ID does not exist" containerID="9e4ce6dfeaf9b0e026f9ac4ea1536ce735fe70c388e030d95ca7052a3c79d580" Apr 23 16:53:58.992194 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.992149 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e4ce6dfeaf9b0e026f9ac4ea1536ce735fe70c388e030d95ca7052a3c79d580"} err="failed to get container status \"9e4ce6dfeaf9b0e026f9ac4ea1536ce735fe70c388e030d95ca7052a3c79d580\": rpc error: code = NotFound desc = could not find container \"9e4ce6dfeaf9b0e026f9ac4ea1536ce735fe70c388e030d95ca7052a3c79d580\": container with ID starting with 
9e4ce6dfeaf9b0e026f9ac4ea1536ce735fe70c388e030d95ca7052a3c79d580 not found: ID does not exist" Apr 23 16:53:58.992194 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.992169 2580 scope.go:117] "RemoveContainer" containerID="0ae56e413c6b42445b35ec5b580ee8cd59c93a9b811d58b6e1665319b4d62b9e" Apr 23 16:53:58.992476 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:53:58.992455 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ae56e413c6b42445b35ec5b580ee8cd59c93a9b811d58b6e1665319b4d62b9e\": container with ID starting with 0ae56e413c6b42445b35ec5b580ee8cd59c93a9b811d58b6e1665319b4d62b9e not found: ID does not exist" containerID="0ae56e413c6b42445b35ec5b580ee8cd59c93a9b811d58b6e1665319b4d62b9e" Apr 23 16:53:58.992537 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:58.992483 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ae56e413c6b42445b35ec5b580ee8cd59c93a9b811d58b6e1665319b4d62b9e"} err="failed to get container status \"0ae56e413c6b42445b35ec5b580ee8cd59c93a9b811d58b6e1665319b4d62b9e\": rpc error: code = NotFound desc = could not find container \"0ae56e413c6b42445b35ec5b580ee8cd59c93a9b811d58b6e1665319b4d62b9e\": container with ID starting with 0ae56e413c6b42445b35ec5b580ee8cd59c93a9b811d58b6e1665319b4d62b9e not found: ID does not exist" Apr 23 16:53:59.686557 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:59.686514 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" podUID="132366bd-209d-482b-9ee7-1060ce431a79" containerName="main" probeResult="failure" output="Get \"https://10.132.0.59:8000/health\": dial tcp 10.132.0.59:8000: connect: connection refused" Apr 23 16:53:59.937981 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:59.936901 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q"] 
Apr 23 16:53:59.937981 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:53:59.937789 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" podUID="60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f" containerName="main" containerID="cri-o://64873c17b3cc12cf9456e8b36de9bc9d4fbbe6e550d8abbb6131b6583d863f58" gracePeriod=30 Apr 23 16:54:00.122237 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:00.122183 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w"] Apr 23 16:54:00.122748 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:00.122676 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" podUID="0031f665-7313-4a25-a9df-297c7fda5a22" containerName="main" containerID="cri-o://9692ecc617eb20a707655a82433d536ee9dd6b9fb257a5efbb64c201f3867d5a" gracePeriod=30 Apr 23 16:54:00.122748 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:00.122727 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" podUID="0031f665-7313-4a25-a9df-297c7fda5a22" containerName="tokenizer" containerID="cri-o://c78095d97fa5616698e9b406ca0d696706d13efd08bcf9cacf87a5be5e93f9e0" gracePeriod=30 Apr 23 16:54:00.442040 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:54:00.442005 2580 logging.go:55] [core] [Channel #111 SubChannel #112]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.58:9003", ServerName: "10.132.0.58:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.58:9003: connect: connection refused" Apr 23 16:54:00.538893 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:00.538810 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e081da07-6ba7-41fa-925c-1c9264885cff" path="/var/lib/kubelet/pods/e081da07-6ba7-41fa-925c-1c9264885cff/volumes" Apr 23 16:54:00.944201 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:00.944110 2580 generic.go:358] "Generic (PLEG): container finished" podID="0031f665-7313-4a25-a9df-297c7fda5a22" containerID="9692ecc617eb20a707655a82433d536ee9dd6b9fb257a5efbb64c201f3867d5a" exitCode=0 Apr 23 16:54:00.944591 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:00.944186 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" event={"ID":"0031f665-7313-4a25-a9df-297c7fda5a22","Type":"ContainerDied","Data":"9692ecc617eb20a707655a82433d536ee9dd6b9fb257a5efbb64c201f3867d5a"} Apr 23 16:54:01.400667 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.400643 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" Apr 23 16:54:01.440347 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.440319 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0031f665-7313-4a25-a9df-297c7fda5a22-tls-certs\") pod \"0031f665-7313-4a25-a9df-297c7fda5a22\" (UID: \"0031f665-7313-4a25-a9df-297c7fda5a22\") " Apr 23 16:54:01.440532 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.440394 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0031f665-7313-4a25-a9df-297c7fda5a22-tokenizer-cache\") pod \"0031f665-7313-4a25-a9df-297c7fda5a22\" (UID: \"0031f665-7313-4a25-a9df-297c7fda5a22\") " Apr 23 16:54:01.440532 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.440423 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0031f665-7313-4a25-a9df-297c7fda5a22-tokenizer-uds\") pod \"0031f665-7313-4a25-a9df-297c7fda5a22\" (UID: \"0031f665-7313-4a25-a9df-297c7fda5a22\") " Apr 23 16:54:01.440532 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.440467 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0031f665-7313-4a25-a9df-297c7fda5a22-kserve-provision-location\") pod \"0031f665-7313-4a25-a9df-297c7fda5a22\" (UID: \"0031f665-7313-4a25-a9df-297c7fda5a22\") " Apr 23 16:54:01.440712 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.440542 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0031f665-7313-4a25-a9df-297c7fda5a22-tokenizer-tmp\") pod \"0031f665-7313-4a25-a9df-297c7fda5a22\" (UID: \"0031f665-7313-4a25-a9df-297c7fda5a22\") " Apr 23 
16:54:01.440712 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.440567 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zdkt\" (UniqueName: \"kubernetes.io/projected/0031f665-7313-4a25-a9df-297c7fda5a22-kube-api-access-2zdkt\") pod \"0031f665-7313-4a25-a9df-297c7fda5a22\" (UID: \"0031f665-7313-4a25-a9df-297c7fda5a22\") " Apr 23 16:54:01.440712 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.440660 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0031f665-7313-4a25-a9df-297c7fda5a22-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "0031f665-7313-4a25-a9df-297c7fda5a22" (UID: "0031f665-7313-4a25-a9df-297c7fda5a22"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:54:01.440712 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.440671 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0031f665-7313-4a25-a9df-297c7fda5a22-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "0031f665-7313-4a25-a9df-297c7fda5a22" (UID: "0031f665-7313-4a25-a9df-297c7fda5a22"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:54:01.440926 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.440824 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0031f665-7313-4a25-a9df-297c7fda5a22-tokenizer-cache\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:54:01.440926 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.440844 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0031f665-7313-4a25-a9df-297c7fda5a22-tokenizer-uds\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:54:01.440926 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.440903 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0031f665-7313-4a25-a9df-297c7fda5a22-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "0031f665-7313-4a25-a9df-297c7fda5a22" (UID: "0031f665-7313-4a25-a9df-297c7fda5a22"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:54:01.441237 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.441199 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0031f665-7313-4a25-a9df-297c7fda5a22-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0031f665-7313-4a25-a9df-297c7fda5a22" (UID: "0031f665-7313-4a25-a9df-297c7fda5a22"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:54:01.442200 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.442157 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" podUID="0031f665-7313-4a25-a9df-297c7fda5a22" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.58:9003\" within 1s: context deadline exceeded" Apr 23 16:54:01.442912 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.442879 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0031f665-7313-4a25-a9df-297c7fda5a22-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "0031f665-7313-4a25-a9df-297c7fda5a22" (UID: "0031f665-7313-4a25-a9df-297c7fda5a22"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:54:01.443345 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.443321 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0031f665-7313-4a25-a9df-297c7fda5a22-kube-api-access-2zdkt" (OuterVolumeSpecName: "kube-api-access-2zdkt") pod "0031f665-7313-4a25-a9df-297c7fda5a22" (UID: "0031f665-7313-4a25-a9df-297c7fda5a22"). InnerVolumeSpecName "kube-api-access-2zdkt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:54:01.542394 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.542355 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0031f665-7313-4a25-a9df-297c7fda5a22-tokenizer-tmp\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:54:01.542394 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.542392 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2zdkt\" (UniqueName: \"kubernetes.io/projected/0031f665-7313-4a25-a9df-297c7fda5a22-kube-api-access-2zdkt\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:54:01.542628 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.542407 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0031f665-7313-4a25-a9df-297c7fda5a22-tls-certs\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:54:01.542628 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.542422 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0031f665-7313-4a25-a9df-297c7fda5a22-kserve-provision-location\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:54:01.958126 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.958031 2580 generic.go:358] "Generic (PLEG): container finished" podID="0031f665-7313-4a25-a9df-297c7fda5a22" containerID="c78095d97fa5616698e9b406ca0d696706d13efd08bcf9cacf87a5be5e93f9e0" exitCode=0 Apr 23 16:54:01.958126 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.958098 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" event={"ID":"0031f665-7313-4a25-a9df-297c7fda5a22","Type":"ContainerDied","Data":"c78095d97fa5616698e9b406ca0d696706d13efd08bcf9cacf87a5be5e93f9e0"} Apr 23 16:54:01.958629 
ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.958137 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" event={"ID":"0031f665-7313-4a25-a9df-297c7fda5a22","Type":"ContainerDied","Data":"0fa5f4d3ee94e72f926b0b3aad6ca040fd5a3cebb0e21eb708578991e05eac9e"} Apr 23 16:54:01.958629 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.958156 2580 scope.go:117] "RemoveContainer" containerID="c78095d97fa5616698e9b406ca0d696706d13efd08bcf9cacf87a5be5e93f9e0" Apr 23 16:54:01.958629 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.958163 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w" Apr 23 16:54:01.967752 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.967735 2580 scope.go:117] "RemoveContainer" containerID="9692ecc617eb20a707655a82433d536ee9dd6b9fb257a5efbb64c201f3867d5a" Apr 23 16:54:01.978778 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.978752 2580 scope.go:117] "RemoveContainer" containerID="15f8faf61d1eda034ff650d5b1bf44bc12249588a8ad5ebba88849633c6c8188" Apr 23 16:54:01.981815 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.981785 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w"] Apr 23 16:54:01.986870 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.986847 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-vs94w"] Apr 23 16:54:01.988729 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.988715 2580 scope.go:117] "RemoveContainer" containerID="c78095d97fa5616698e9b406ca0d696706d13efd08bcf9cacf87a5be5e93f9e0" Apr 23 16:54:01.989015 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:54:01.988986 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"c78095d97fa5616698e9b406ca0d696706d13efd08bcf9cacf87a5be5e93f9e0\": container with ID starting with c78095d97fa5616698e9b406ca0d696706d13efd08bcf9cacf87a5be5e93f9e0 not found: ID does not exist" containerID="c78095d97fa5616698e9b406ca0d696706d13efd08bcf9cacf87a5be5e93f9e0" Apr 23 16:54:01.989069 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.989030 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c78095d97fa5616698e9b406ca0d696706d13efd08bcf9cacf87a5be5e93f9e0"} err="failed to get container status \"c78095d97fa5616698e9b406ca0d696706d13efd08bcf9cacf87a5be5e93f9e0\": rpc error: code = NotFound desc = could not find container \"c78095d97fa5616698e9b406ca0d696706d13efd08bcf9cacf87a5be5e93f9e0\": container with ID starting with c78095d97fa5616698e9b406ca0d696706d13efd08bcf9cacf87a5be5e93f9e0 not found: ID does not exist" Apr 23 16:54:01.989069 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.989061 2580 scope.go:117] "RemoveContainer" containerID="9692ecc617eb20a707655a82433d536ee9dd6b9fb257a5efbb64c201f3867d5a" Apr 23 16:54:01.989395 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:54:01.989377 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9692ecc617eb20a707655a82433d536ee9dd6b9fb257a5efbb64c201f3867d5a\": container with ID starting with 9692ecc617eb20a707655a82433d536ee9dd6b9fb257a5efbb64c201f3867d5a not found: ID does not exist" containerID="9692ecc617eb20a707655a82433d536ee9dd6b9fb257a5efbb64c201f3867d5a" Apr 23 16:54:01.989455 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.989399 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9692ecc617eb20a707655a82433d536ee9dd6b9fb257a5efbb64c201f3867d5a"} err="failed to get container status \"9692ecc617eb20a707655a82433d536ee9dd6b9fb257a5efbb64c201f3867d5a\": rpc error: code = NotFound desc 
= could not find container \"9692ecc617eb20a707655a82433d536ee9dd6b9fb257a5efbb64c201f3867d5a\": container with ID starting with 9692ecc617eb20a707655a82433d536ee9dd6b9fb257a5efbb64c201f3867d5a not found: ID does not exist" Apr 23 16:54:01.989455 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.989415 2580 scope.go:117] "RemoveContainer" containerID="15f8faf61d1eda034ff650d5b1bf44bc12249588a8ad5ebba88849633c6c8188" Apr 23 16:54:01.989668 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:54:01.989653 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15f8faf61d1eda034ff650d5b1bf44bc12249588a8ad5ebba88849633c6c8188\": container with ID starting with 15f8faf61d1eda034ff650d5b1bf44bc12249588a8ad5ebba88849633c6c8188 not found: ID does not exist" containerID="15f8faf61d1eda034ff650d5b1bf44bc12249588a8ad5ebba88849633c6c8188" Apr 23 16:54:01.989718 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:01.989669 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15f8faf61d1eda034ff650d5b1bf44bc12249588a8ad5ebba88849633c6c8188"} err="failed to get container status \"15f8faf61d1eda034ff650d5b1bf44bc12249588a8ad5ebba88849633c6c8188\": rpc error: code = NotFound desc = could not find container \"15f8faf61d1eda034ff650d5b1bf44bc12249588a8ad5ebba88849633c6c8188\": container with ID starting with 15f8faf61d1eda034ff650d5b1bf44bc12249588a8ad5ebba88849633c6c8188 not found: ID does not exist" Apr 23 16:54:02.538366 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:02.538333 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0031f665-7313-4a25-a9df-297c7fda5a22" path="/var/lib/kubelet/pods/0031f665-7313-4a25-a9df-297c7fda5a22/volumes" Apr 23 16:54:09.687085 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:09.687031 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" 
podUID="132366bd-209d-482b-9ee7-1060ce431a79" containerName="main" probeResult="failure" output="Get \"https://10.132.0.59:8000/health\": dial tcp 10.132.0.59:8000: connect: connection refused" Apr 23 16:54:19.687073 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:19.687029 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" podUID="132366bd-209d-482b-9ee7-1060ce431a79" containerName="main" probeResult="failure" output="Get \"https://10.132.0.59:8000/health\": dial tcp 10.132.0.59:8000: connect: connection refused" Apr 23 16:54:27.667902 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.667866 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh"] Apr 23 16:54:27.668319 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.668286 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e081da07-6ba7-41fa-925c-1c9264885cff" containerName="main" Apr 23 16:54:27.668319 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.668316 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="e081da07-6ba7-41fa-925c-1c9264885cff" containerName="main" Apr 23 16:54:27.668401 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.668330 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0031f665-7313-4a25-a9df-297c7fda5a22" containerName="storage-initializer" Apr 23 16:54:27.668401 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.668335 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0031f665-7313-4a25-a9df-297c7fda5a22" containerName="storage-initializer" Apr 23 16:54:27.668401 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.668345 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0031f665-7313-4a25-a9df-297c7fda5a22" containerName="main" Apr 23 16:54:27.668401 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.668351 2580 
state_mem.go:107] "Deleted CPUSet assignment" podUID="0031f665-7313-4a25-a9df-297c7fda5a22" containerName="main" Apr 23 16:54:27.668401 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.668357 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0031f665-7313-4a25-a9df-297c7fda5a22" containerName="tokenizer" Apr 23 16:54:27.668401 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.668364 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0031f665-7313-4a25-a9df-297c7fda5a22" containerName="tokenizer" Apr 23 16:54:27.668401 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.668381 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e081da07-6ba7-41fa-925c-1c9264885cff" containerName="storage-initializer" Apr 23 16:54:27.668401 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.668386 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="e081da07-6ba7-41fa-925c-1c9264885cff" containerName="storage-initializer" Apr 23 16:54:27.668642 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.668444 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="e081da07-6ba7-41fa-925c-1c9264885cff" containerName="main" Apr 23 16:54:27.668642 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.668451 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="0031f665-7313-4a25-a9df-297c7fda5a22" containerName="main" Apr 23 16:54:27.668642 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.668461 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="0031f665-7313-4a25-a9df-297c7fda5a22" containerName="tokenizer" Apr 23 16:54:27.673889 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.673865 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh" Apr 23 16:54:27.682005 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.681969 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh"] Apr 23 16:54:27.775971 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.775941 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-dshm\") pod \"stop-feature-test-kserve-77cc947894-lftbh\" (UID: \"bcc1b2be-845f-421c-ae9c-ddcb84738a6a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh" Apr 23 16:54:27.775971 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.775980 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-home\") pod \"stop-feature-test-kserve-77cc947894-lftbh\" (UID: \"bcc1b2be-845f-421c-ae9c-ddcb84738a6a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh" Apr 23 16:54:27.776227 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.776071 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-kserve-provision-location\") pod \"stop-feature-test-kserve-77cc947894-lftbh\" (UID: \"bcc1b2be-845f-421c-ae9c-ddcb84738a6a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh" Apr 23 16:54:27.776227 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.776121 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f4v4\" (UniqueName: \"kubernetes.io/projected/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-kube-api-access-2f4v4\") pod 
\"stop-feature-test-kserve-77cc947894-lftbh\" (UID: \"bcc1b2be-845f-421c-ae9c-ddcb84738a6a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh" Apr 23 16:54:27.776337 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.776224 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-tls-certs\") pod \"stop-feature-test-kserve-77cc947894-lftbh\" (UID: \"bcc1b2be-845f-421c-ae9c-ddcb84738a6a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh" Apr 23 16:54:27.776337 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.776283 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-model-cache\") pod \"stop-feature-test-kserve-77cc947894-lftbh\" (UID: \"bcc1b2be-845f-421c-ae9c-ddcb84738a6a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh" Apr 23 16:54:27.877170 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.877135 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-kserve-provision-location\") pod \"stop-feature-test-kserve-77cc947894-lftbh\" (UID: \"bcc1b2be-845f-421c-ae9c-ddcb84738a6a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh" Apr 23 16:54:27.877378 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.877190 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2f4v4\" (UniqueName: \"kubernetes.io/projected/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-kube-api-access-2f4v4\") pod \"stop-feature-test-kserve-77cc947894-lftbh\" (UID: \"bcc1b2be-845f-421c-ae9c-ddcb84738a6a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh" Apr 23 
16:54:27.877378 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.877239 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-tls-certs\") pod \"stop-feature-test-kserve-77cc947894-lftbh\" (UID: \"bcc1b2be-845f-421c-ae9c-ddcb84738a6a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh"
Apr 23 16:54:27.877378 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.877281 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-model-cache\") pod \"stop-feature-test-kserve-77cc947894-lftbh\" (UID: \"bcc1b2be-845f-421c-ae9c-ddcb84738a6a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh"
Apr 23 16:54:27.877378 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.877326 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-dshm\") pod \"stop-feature-test-kserve-77cc947894-lftbh\" (UID: \"bcc1b2be-845f-421c-ae9c-ddcb84738a6a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh"
Apr 23 16:54:27.877378 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.877362 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-home\") pod \"stop-feature-test-kserve-77cc947894-lftbh\" (UID: \"bcc1b2be-845f-421c-ae9c-ddcb84738a6a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh"
Apr 23 16:54:27.877665 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.877638 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-kserve-provision-location\") pod \"stop-feature-test-kserve-77cc947894-lftbh\" (UID: \"bcc1b2be-845f-421c-ae9c-ddcb84738a6a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh"
Apr 23 16:54:27.877728 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.877672 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-model-cache\") pod \"stop-feature-test-kserve-77cc947894-lftbh\" (UID: \"bcc1b2be-845f-421c-ae9c-ddcb84738a6a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh"
Apr 23 16:54:27.877830 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.877798 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-home\") pod \"stop-feature-test-kserve-77cc947894-lftbh\" (UID: \"bcc1b2be-845f-421c-ae9c-ddcb84738a6a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh"
Apr 23 16:54:27.879628 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.879597 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-dshm\") pod \"stop-feature-test-kserve-77cc947894-lftbh\" (UID: \"bcc1b2be-845f-421c-ae9c-ddcb84738a6a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh"
Apr 23 16:54:27.879886 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.879867 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-tls-certs\") pod \"stop-feature-test-kserve-77cc947894-lftbh\" (UID: \"bcc1b2be-845f-421c-ae9c-ddcb84738a6a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh"
Apr 23 16:54:27.894152 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.894120 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f4v4\" (UniqueName: \"kubernetes.io/projected/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-kube-api-access-2f4v4\") pod \"stop-feature-test-kserve-77cc947894-lftbh\" (UID: \"bcc1b2be-845f-421c-ae9c-ddcb84738a6a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh"
Apr 23 16:54:27.986075 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:27.986038 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh"
Apr 23 16:54:28.130398 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:28.130362 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh"]
Apr 23 16:54:28.131522 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:54:28.131496 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcc1b2be_845f_421c_ae9c_ddcb84738a6a.slice/crio-14d729a23211c12e70c610b68947eee55e8f822895b322d86b2815ec5760e500 WatchSource:0}: Error finding container 14d729a23211c12e70c610b68947eee55e8f822895b322d86b2815ec5760e500: Status 404 returned error can't find the container with id 14d729a23211c12e70c610b68947eee55e8f822895b322d86b2815ec5760e500
Apr 23 16:54:29.091651 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:29.091603 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh" event={"ID":"bcc1b2be-845f-421c-ae9c-ddcb84738a6a","Type":"ContainerStarted","Data":"f5ff764afd4149f33f7b4dba90dbc25a86bea7cf4bca96df0bdf5c2023efe53d"}
Apr 23 16:54:29.091651 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:29.091650 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh" event={"ID":"bcc1b2be-845f-421c-ae9c-ddcb84738a6a","Type":"ContainerStarted","Data":"14d729a23211c12e70c610b68947eee55e8f822895b322d86b2815ec5760e500"}
Apr 23 16:54:29.686844 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:29.686801 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" podUID="132366bd-209d-482b-9ee7-1060ce431a79" containerName="main" probeResult="failure" output="Get \"https://10.132.0.59:8000/health\": dial tcp 10.132.0.59:8000: connect: connection refused"
Apr 23 16:54:30.099683 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:30.099616 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-77cc947894-z577q_60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f/main/0.log"
Apr 23 16:54:30.100270 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:30.100245 2580 generic.go:358] "Generic (PLEG): container finished" podID="60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f" containerID="64873c17b3cc12cf9456e8b36de9bc9d4fbbe6e550d8abbb6131b6583d863f58" exitCode=137
Apr 23 16:54:30.100403 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:30.100333 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" event={"ID":"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f","Type":"ContainerDied","Data":"64873c17b3cc12cf9456e8b36de9bc9d4fbbe6e550d8abbb6131b6583d863f58"}
Apr 23 16:54:30.224903 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:30.224874 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-77cc947894-z577q_60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f/main/0.log"
Apr 23 16:54:30.225286 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:30.225269 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q"
Apr 23 16:54:30.300090 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:30.300051 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-kserve-provision-location\") pod \"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f\" (UID: \"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f\") "
Apr 23 16:54:30.300263 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:30.300152 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrkh2\" (UniqueName: \"kubernetes.io/projected/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-kube-api-access-nrkh2\") pod \"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f\" (UID: \"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f\") "
Apr 23 16:54:30.300263 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:30.300258 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-model-cache\") pod \"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f\" (UID: \"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f\") "
Apr 23 16:54:30.300430 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:30.300286 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-home\") pod \"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f\" (UID: \"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f\") "
Apr 23 16:54:30.300430 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:30.300361 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-tls-certs\") pod \"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f\" (UID: \"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f\") "
Apr 23 16:54:30.300549 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:30.300443 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-dshm\") pod \"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f\" (UID: \"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f\") "
Apr 23 16:54:30.301184 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:30.300884 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-model-cache" (OuterVolumeSpecName: "model-cache") pod "60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f" (UID: "60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:54:30.301329 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:30.301223 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-home" (OuterVolumeSpecName: "home") pod "60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f" (UID: "60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:54:30.305821 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:30.303764 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-dshm" (OuterVolumeSpecName: "dshm") pod "60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f" (UID: "60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:54:30.305821 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:30.304538 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f" (UID: "60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:54:30.306529 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:30.305878 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-kube-api-access-nrkh2" (OuterVolumeSpecName: "kube-api-access-nrkh2") pod "60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f" (UID: "60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f"). InnerVolumeSpecName "kube-api-access-nrkh2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:54:30.339213 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:30.339174 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f" (UID: "60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:54:30.401341 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:30.401281 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-kserve-provision-location\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:54:30.401341 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:30.401336 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nrkh2\" (UniqueName: \"kubernetes.io/projected/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-kube-api-access-nrkh2\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:54:30.401341 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:30.401351 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-model-cache\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:54:30.401670 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:30.401360 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-home\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:54:30.401670 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:30.401369 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-tls-certs\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:54:30.401670 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:30.401379 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f-dshm\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:54:31.105518 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:31.105486 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-77cc947894-z577q_60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f/main/0.log"
Apr 23 16:54:31.105991 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:31.105927 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q"
Apr 23 16:54:31.105991 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:31.105925 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q" event={"ID":"60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f","Type":"ContainerDied","Data":"54654088e7e2e40e136e7cc75280a3a21ee24ff616bf25d9c72ddc3615645a38"}
Apr 23 16:54:31.105991 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:31.105978 2580 scope.go:117] "RemoveContainer" containerID="64873c17b3cc12cf9456e8b36de9bc9d4fbbe6e550d8abbb6131b6583d863f58"
Apr 23 16:54:31.135657 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:31.135607 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q"]
Apr 23 16:54:31.137324 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:31.137274 2580 scope.go:117] "RemoveContainer" containerID="0933f7b5d606b14445437a23bac67dd9f95e894bde0bea57963a42ed2ee0cfda"
Apr 23 16:54:31.140856 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:31.140827 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-z577q"]
Apr 23 16:54:32.538086 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:32.538043 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f" path="/var/lib/kubelet/pods/60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f/volumes"
Apr 23 16:54:33.116464 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:33.116431 2580 generic.go:358] "Generic (PLEG): container finished" podID="bcc1b2be-845f-421c-ae9c-ddcb84738a6a" containerID="f5ff764afd4149f33f7b4dba90dbc25a86bea7cf4bca96df0bdf5c2023efe53d" exitCode=0
Apr 23 16:54:33.116666 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:33.116495 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh" event={"ID":"bcc1b2be-845f-421c-ae9c-ddcb84738a6a","Type":"ContainerDied","Data":"f5ff764afd4149f33f7b4dba90dbc25a86bea7cf4bca96df0bdf5c2023efe53d"}
Apr 23 16:54:34.122602 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:34.122566 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh" event={"ID":"bcc1b2be-845f-421c-ae9c-ddcb84738a6a","Type":"ContainerStarted","Data":"0b169a7723e70b51be25ddf25cc2383a8c81a594fb06fd8c51374869476272af"}
Apr 23 16:54:34.143591 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:34.143532 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh" podStartSLOduration=7.14351727 podStartE2EDuration="7.14351727s" podCreationTimestamp="2026-04-23 16:54:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:54:34.141264657 +0000 UTC m=+1160.202338914" watchObservedRunningTime="2026-04-23 16:54:34.14351727 +0000 UTC m=+1160.204591522"
Apr 23 16:54:37.986219 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:37.986175 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh"
Apr 23 16:54:37.987108 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:37.987084 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh"
Apr 23 16:54:37.987386 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:37.987348 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh" podUID="bcc1b2be-845f-421c-ae9c-ddcb84738a6a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.60:8000/health\": dial tcp 10.132.0.60:8000: connect: connection refused"
Apr 23 16:54:39.687550 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:39.687504 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" podUID="132366bd-209d-482b-9ee7-1060ce431a79" containerName="main" probeResult="failure" output="Get \"https://10.132.0.59:8000/health\": dial tcp 10.132.0.59:8000: connect: connection refused"
Apr 23 16:54:47.986969 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:47.986911 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh" podUID="bcc1b2be-845f-421c-ae9c-ddcb84738a6a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.60:8000/health\": dial tcp 10.132.0.60:8000: connect: connection refused"
Apr 23 16:54:49.687615 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:49.687558 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" podUID="132366bd-209d-482b-9ee7-1060ce431a79" containerName="main" probeResult="failure" output="Get \"https://10.132.0.59:8000/health\": dial tcp 10.132.0.59:8000: connect: connection refused"
Apr 23 16:54:57.986899 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:57.986850 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh" podUID="bcc1b2be-845f-421c-ae9c-ddcb84738a6a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.60:8000/health\": dial tcp 10.132.0.60:8000: connect: connection refused"
Apr 23 16:54:59.687127 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:54:59.687080 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" podUID="132366bd-209d-482b-9ee7-1060ce431a79" containerName="main" probeResult="failure" output="Get \"https://10.132.0.59:8000/health\": dial tcp 10.132.0.59:8000: connect: connection refused"
Apr 23 16:55:07.986483 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:07.986429 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh" podUID="bcc1b2be-845f-421c-ae9c-ddcb84738a6a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.60:8000/health\": dial tcp 10.132.0.60:8000: connect: connection refused"
Apr 23 16:55:09.687392 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:09.687339 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" podUID="132366bd-209d-482b-9ee7-1060ce431a79" containerName="main" probeResult="failure" output="Get \"https://10.132.0.59:8000/health\": dial tcp 10.132.0.59:8000: connect: connection refused"
Apr 23 16:55:14.565513 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:14.565484 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfkqz_5949893b-cd3d-46d5-b194-4ef1ad542b81/ovn-acl-logging/0.log"
Apr 23 16:55:14.567476 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:14.567451 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfkqz_5949893b-cd3d-46d5-b194-4ef1ad542b81/ovn-acl-logging/0.log"
Apr 23 16:55:17.987092 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:17.987037 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh" podUID="bcc1b2be-845f-421c-ae9c-ddcb84738a6a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.60:8000/health\": dial tcp 10.132.0.60:8000: connect: connection refused"
Apr 23 16:55:19.696247 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:19.696215 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d"
Apr 23 16:55:19.703874 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:19.703850 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d"
Apr 23 16:55:26.275312 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:26.274821 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d"]
Apr 23 16:55:26.275312 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:26.275225 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" podUID="132366bd-209d-482b-9ee7-1060ce431a79" containerName="main" containerID="cri-o://e265f5bd407a16803f65753b310725b3011ae68a512c3f5ee34a8d6d3f7a3d32" gracePeriod=30
Apr 23 16:55:27.986996 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:27.986948 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh" podUID="bcc1b2be-845f-421c-ae9c-ddcb84738a6a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.60:8000/health\": dial tcp 10.132.0.60:8000: connect: connection refused"
Apr 23 16:55:37.987018 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:37.986967 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh" podUID="bcc1b2be-845f-421c-ae9c-ddcb84738a6a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.60:8000/health\": dial tcp 10.132.0.60:8000: connect: connection refused"
Apr 23 16:55:40.951722 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:40.951682 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh"]
Apr 23 16:55:40.952527 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:40.952505 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f" containerName="main"
Apr 23 16:55:40.952527 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:40.952528 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f" containerName="main"
Apr 23 16:55:40.952719 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:40.952552 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f" containerName="storage-initializer"
Apr 23 16:55:40.952719 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:40.952561 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f" containerName="storage-initializer"
Apr 23 16:55:40.952719 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:40.952698 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="60b1ccbf-0d81-48bc-8e76-3cca3eb7ad4f" containerName="main"
Apr 23 16:55:40.956500 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:40.956473 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh"
Apr 23 16:55:40.959317 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:40.959277 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\""
Apr 23 16:55:40.973258 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:40.973226 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh"]
Apr 23 16:55:41.062693 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:41.062650 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-home\") pod \"router-with-refs-test-kserve-5bbfcf7894-tdvhh\" (UID: \"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh"
Apr 23 16:55:41.062863 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:41.062714 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-tls-certs\") pod \"router-with-refs-test-kserve-5bbfcf7894-tdvhh\" (UID: \"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh"
Apr 23 16:55:41.062863 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:41.062812 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-dshm\") pod \"router-with-refs-test-kserve-5bbfcf7894-tdvhh\" (UID: \"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh"
Apr 23 16:55:41.062990 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:41.062867 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-kserve-provision-location\") pod \"router-with-refs-test-kserve-5bbfcf7894-tdvhh\" (UID: \"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh"
Apr 23 16:55:41.062990 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:41.062896 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-model-cache\") pod \"router-with-refs-test-kserve-5bbfcf7894-tdvhh\" (UID: \"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh"
Apr 23 16:55:41.062990 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:41.062938 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlzln\" (UniqueName: \"kubernetes.io/projected/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-kube-api-access-rlzln\") pod \"router-with-refs-test-kserve-5bbfcf7894-tdvhh\" (UID: \"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh"
Apr 23 16:55:41.164265 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:41.164232 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-dshm\") pod \"router-with-refs-test-kserve-5bbfcf7894-tdvhh\" (UID: \"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh"
Apr 23 16:55:41.164458 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:41.164305 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-kserve-provision-location\") pod \"router-with-refs-test-kserve-5bbfcf7894-tdvhh\" (UID: \"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh"
Apr 23 16:55:41.164458 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:41.164334 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-model-cache\") pod \"router-with-refs-test-kserve-5bbfcf7894-tdvhh\" (UID: \"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh"
Apr 23 16:55:41.164458 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:41.164370 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rlzln\" (UniqueName: \"kubernetes.io/projected/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-kube-api-access-rlzln\") pod \"router-with-refs-test-kserve-5bbfcf7894-tdvhh\" (UID: \"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh"
Apr 23 16:55:41.164458 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:41.164418 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-home\") pod \"router-with-refs-test-kserve-5bbfcf7894-tdvhh\" (UID: \"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh"
Apr 23 16:55:41.164458 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:41.164453 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-tls-certs\") pod \"router-with-refs-test-kserve-5bbfcf7894-tdvhh\" (UID: \"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh"
Apr 23 16:55:41.164874 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:41.164805 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-model-cache\") pod \"router-with-refs-test-kserve-5bbfcf7894-tdvhh\" (UID: \"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh"
Apr 23 16:55:41.164874 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:41.164826 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-kserve-provision-location\") pod \"router-with-refs-test-kserve-5bbfcf7894-tdvhh\" (UID: \"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh"
Apr 23 16:55:41.165079 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:41.165002 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-home\") pod \"router-with-refs-test-kserve-5bbfcf7894-tdvhh\" (UID: \"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh"
Apr 23 16:55:41.166543 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:41.166510 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-dshm\") pod \"router-with-refs-test-kserve-5bbfcf7894-tdvhh\" (UID: \"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh"
Apr 23 16:55:41.166899 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:41.166879 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-tls-certs\") pod \"router-with-refs-test-kserve-5bbfcf7894-tdvhh\" (UID: \"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh"
Apr 23 16:55:41.174343 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:41.174318 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlzln\" (UniqueName: \"kubernetes.io/projected/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-kube-api-access-rlzln\") pod \"router-with-refs-test-kserve-5bbfcf7894-tdvhh\" (UID: \"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh"
Apr 23 16:55:41.273392 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:41.273352 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh"
Apr 23 16:55:41.624336 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:41.623851 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh"]
Apr 23 16:55:41.624515 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:55:41.624489 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9c70a8d_65a0_4b62_a6ef_6e9c11189e65.slice/crio-7d689fbc87556511b548d77014726e0383ca6fde732174cb1e4506d5b681e3cf WatchSource:0}: Error finding container 7d689fbc87556511b548d77014726e0383ca6fde732174cb1e4506d5b681e3cf: Status 404 returned error can't find the container with id 7d689fbc87556511b548d77014726e0383ca6fde732174cb1e4506d5b681e3cf
Apr 23 16:55:42.427984 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:42.427943 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh" event={"ID":"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65","Type":"ContainerStarted","Data":"4e4fbf63b5cb23f099fe5b1f52d1d61c7befcbb3187057c4986bd07d8bcb9d2b"}
Apr 23 16:55:42.428370 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:42.427992 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh" event={"ID":"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65","Type":"ContainerStarted","Data":"7d689fbc87556511b548d77014726e0383ca6fde732174cb1e4506d5b681e3cf"}
Apr 23 16:55:46.448065 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:46.448026 2580 generic.go:358] "Generic (PLEG): container finished" podID="c9c70a8d-65a0-4b62-a6ef-6e9c11189e65" containerID="4e4fbf63b5cb23f099fe5b1f52d1d61c7befcbb3187057c4986bd07d8bcb9d2b" exitCode=0
Apr 23 16:55:46.448548 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:46.448095 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh" event={"ID":"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65","Type":"ContainerDied","Data":"4e4fbf63b5cb23f099fe5b1f52d1d61c7befcbb3187057c4986bd07d8bcb9d2b"}
Apr 23 16:55:47.454382 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:47.454328 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh" event={"ID":"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65","Type":"ContainerStarted","Data":"2630c44af3f239879f75e33d7814ac3e7f4948fa2bcf29664cb49ceaccf12c2b"}
Apr 23 16:55:47.479118 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:47.479045 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh" podStartSLOduration=7.479025983 podStartE2EDuration="7.479025983s" podCreationTimestamp="2026-04-23 16:55:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:55:47.474318921 +0000 UTC m=+1233.535393174" watchObservedRunningTime="2026-04-23 16:55:47.479025983 +0000 UTC m=+1233.540100237"
Apr 23 16:55:47.986808 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:47.986768 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh" podUID="bcc1b2be-845f-421c-ae9c-ddcb84738a6a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.60:8000/health\": dial tcp 10.132.0.60:8000: connect: connection refused"
Apr 23 16:55:51.274228 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:51.274185 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh"
Apr 23 16:55:51.274671 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:51.274242 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh"
Apr 23 16:55:51.276009 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:51.275974 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh" podUID="c9c70a8d-65a0-4b62-a6ef-6e9c11189e65" containerName="main" probeResult="failure" output="Get \"https://10.132.0.61:8000/health\": dial tcp 10.132.0.61:8000: connect: connection refused"
Apr 23 16:55:56.774395 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:56.774366 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-5594956b58-gtd9d_132366bd-209d-482b-9ee7-1060ce431a79/main/0.log"
Apr 23 16:55:56.774815 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:56.774799 2580 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" Apr 23 16:55:56.927681 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:56.927597 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/132366bd-209d-482b-9ee7-1060ce431a79-model-cache\") pod \"132366bd-209d-482b-9ee7-1060ce431a79\" (UID: \"132366bd-209d-482b-9ee7-1060ce431a79\") " Apr 23 16:55:56.927681 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:56.927663 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/132366bd-209d-482b-9ee7-1060ce431a79-kserve-provision-location\") pod \"132366bd-209d-482b-9ee7-1060ce431a79\" (UID: \"132366bd-209d-482b-9ee7-1060ce431a79\") " Apr 23 16:55:56.927981 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:56.927707 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/132366bd-209d-482b-9ee7-1060ce431a79-tls-certs\") pod \"132366bd-209d-482b-9ee7-1060ce431a79\" (UID: \"132366bd-209d-482b-9ee7-1060ce431a79\") " Apr 23 16:55:56.927981 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:56.927748 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8z62\" (UniqueName: \"kubernetes.io/projected/132366bd-209d-482b-9ee7-1060ce431a79-kube-api-access-w8z62\") pod \"132366bd-209d-482b-9ee7-1060ce431a79\" (UID: \"132366bd-209d-482b-9ee7-1060ce431a79\") " Apr 23 16:55:56.927981 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:56.927793 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/132366bd-209d-482b-9ee7-1060ce431a79-home\") pod \"132366bd-209d-482b-9ee7-1060ce431a79\" (UID: \"132366bd-209d-482b-9ee7-1060ce431a79\") " Apr 23 16:55:56.927981 
ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:56.927837 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/132366bd-209d-482b-9ee7-1060ce431a79-dshm\") pod \"132366bd-209d-482b-9ee7-1060ce431a79\" (UID: \"132366bd-209d-482b-9ee7-1060ce431a79\") " Apr 23 16:55:56.927981 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:56.927860 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/132366bd-209d-482b-9ee7-1060ce431a79-model-cache" (OuterVolumeSpecName: "model-cache") pod "132366bd-209d-482b-9ee7-1060ce431a79" (UID: "132366bd-209d-482b-9ee7-1060ce431a79"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:55:56.928253 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:56.928107 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/132366bd-209d-482b-9ee7-1060ce431a79-model-cache\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:55:56.928253 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:56.928115 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/132366bd-209d-482b-9ee7-1060ce431a79-home" (OuterVolumeSpecName: "home") pod "132366bd-209d-482b-9ee7-1060ce431a79" (UID: "132366bd-209d-482b-9ee7-1060ce431a79"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:55:56.930124 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:56.930089 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/132366bd-209d-482b-9ee7-1060ce431a79-kube-api-access-w8z62" (OuterVolumeSpecName: "kube-api-access-w8z62") pod "132366bd-209d-482b-9ee7-1060ce431a79" (UID: "132366bd-209d-482b-9ee7-1060ce431a79"). InnerVolumeSpecName "kube-api-access-w8z62". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:55:56.930391 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:56.930370 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/132366bd-209d-482b-9ee7-1060ce431a79-dshm" (OuterVolumeSpecName: "dshm") pod "132366bd-209d-482b-9ee7-1060ce431a79" (UID: "132366bd-209d-482b-9ee7-1060ce431a79"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:55:56.930479 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:56.930370 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/132366bd-209d-482b-9ee7-1060ce431a79-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "132366bd-209d-482b-9ee7-1060ce431a79" (UID: "132366bd-209d-482b-9ee7-1060ce431a79"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:55:56.990598 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:56.990552 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/132366bd-209d-482b-9ee7-1060ce431a79-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "132366bd-209d-482b-9ee7-1060ce431a79" (UID: "132366bd-209d-482b-9ee7-1060ce431a79"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:55:57.028680 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:57.028644 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/132366bd-209d-482b-9ee7-1060ce431a79-kserve-provision-location\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:55:57.028680 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:57.028675 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/132366bd-209d-482b-9ee7-1060ce431a79-tls-certs\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:55:57.028680 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:57.028689 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w8z62\" (UniqueName: \"kubernetes.io/projected/132366bd-209d-482b-9ee7-1060ce431a79-kube-api-access-w8z62\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:55:57.028911 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:57.028703 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/132366bd-209d-482b-9ee7-1060ce431a79-home\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:55:57.028911 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:57.028715 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/132366bd-209d-482b-9ee7-1060ce431a79-dshm\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:55:57.500835 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:57.500804 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-5594956b58-gtd9d_132366bd-209d-482b-9ee7-1060ce431a79/main/0.log" Apr 23 16:55:57.501221 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:57.501188 2580 generic.go:358] "Generic (PLEG): container 
finished" podID="132366bd-209d-482b-9ee7-1060ce431a79" containerID="e265f5bd407a16803f65753b310725b3011ae68a512c3f5ee34a8d6d3f7a3d32" exitCode=137 Apr 23 16:55:57.501342 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:57.501263 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" event={"ID":"132366bd-209d-482b-9ee7-1060ce431a79","Type":"ContainerDied","Data":"e265f5bd407a16803f65753b310725b3011ae68a512c3f5ee34a8d6d3f7a3d32"} Apr 23 16:55:57.501342 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:57.501315 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" Apr 23 16:55:57.501342 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:57.501334 2580 scope.go:117] "RemoveContainer" containerID="e265f5bd407a16803f65753b310725b3011ae68a512c3f5ee34a8d6d3f7a3d32" Apr 23 16:55:57.501508 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:57.501322 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d" event={"ID":"132366bd-209d-482b-9ee7-1060ce431a79","Type":"ContainerDied","Data":"b86b43b4856ccd58b1a0d7f5bf363b8184cf7437c81b429c1d3aa4dc0b3ac663"} Apr 23 16:55:57.526886 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:57.526852 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d"] Apr 23 16:55:57.530173 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:57.530151 2580 scope.go:117] "RemoveContainer" containerID="102c5155d9d15af30d6d86419e3e6c8c5d5bc7b32bea6424b349991e8f95fc7b" Apr 23 16:55:57.532569 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:57.532546 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-5594956b58-gtd9d"] Apr 23 16:55:57.579922 ip-10-0-128-198 kubenswrapper[2580]: I0423 
16:55:57.579898 2580 scope.go:117] "RemoveContainer" containerID="e265f5bd407a16803f65753b310725b3011ae68a512c3f5ee34a8d6d3f7a3d32" Apr 23 16:55:57.580282 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:55:57.580261 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e265f5bd407a16803f65753b310725b3011ae68a512c3f5ee34a8d6d3f7a3d32\": container with ID starting with e265f5bd407a16803f65753b310725b3011ae68a512c3f5ee34a8d6d3f7a3d32 not found: ID does not exist" containerID="e265f5bd407a16803f65753b310725b3011ae68a512c3f5ee34a8d6d3f7a3d32" Apr 23 16:55:57.580358 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:57.580310 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e265f5bd407a16803f65753b310725b3011ae68a512c3f5ee34a8d6d3f7a3d32"} err="failed to get container status \"e265f5bd407a16803f65753b310725b3011ae68a512c3f5ee34a8d6d3f7a3d32\": rpc error: code = NotFound desc = could not find container \"e265f5bd407a16803f65753b310725b3011ae68a512c3f5ee34a8d6d3f7a3d32\": container with ID starting with e265f5bd407a16803f65753b310725b3011ae68a512c3f5ee34a8d6d3f7a3d32 not found: ID does not exist" Apr 23 16:55:57.580358 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:57.580335 2580 scope.go:117] "RemoveContainer" containerID="102c5155d9d15af30d6d86419e3e6c8c5d5bc7b32bea6424b349991e8f95fc7b" Apr 23 16:55:57.580659 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:55:57.580638 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"102c5155d9d15af30d6d86419e3e6c8c5d5bc7b32bea6424b349991e8f95fc7b\": container with ID starting with 102c5155d9d15af30d6d86419e3e6c8c5d5bc7b32bea6424b349991e8f95fc7b not found: ID does not exist" containerID="102c5155d9d15af30d6d86419e3e6c8c5d5bc7b32bea6424b349991e8f95fc7b" Apr 23 16:55:57.580659 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:57.580664 2580 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"102c5155d9d15af30d6d86419e3e6c8c5d5bc7b32bea6424b349991e8f95fc7b"} err="failed to get container status \"102c5155d9d15af30d6d86419e3e6c8c5d5bc7b32bea6424b349991e8f95fc7b\": rpc error: code = NotFound desc = could not find container \"102c5155d9d15af30d6d86419e3e6c8c5d5bc7b32bea6424b349991e8f95fc7b\": container with ID starting with 102c5155d9d15af30d6d86419e3e6c8c5d5bc7b32bea6424b349991e8f95fc7b not found: ID does not exist" Apr 23 16:55:57.987113 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:57.987060 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh" podUID="bcc1b2be-845f-421c-ae9c-ddcb84738a6a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.60:8000/health\": dial tcp 10.132.0.60:8000: connect: connection refused" Apr 23 16:55:58.538193 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:55:58.538162 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="132366bd-209d-482b-9ee7-1060ce431a79" path="/var/lib/kubelet/pods/132366bd-209d-482b-9ee7-1060ce431a79/volumes" Apr 23 16:56:01.274407 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:01.274358 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh" podUID="c9c70a8d-65a0-4b62-a6ef-6e9c11189e65" containerName="main" probeResult="failure" output="Get \"https://10.132.0.61:8000/health\": dial tcp 10.132.0.61:8000: connect: connection refused" Apr 23 16:56:07.996653 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:07.996617 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh" Apr 23 16:56:08.004365 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:08.004335 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh" Apr 23 16:56:09.327761 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:09.327720 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh"] Apr 23 16:56:09.554557 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:09.554373 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh" podUID="bcc1b2be-845f-421c-ae9c-ddcb84738a6a" containerName="main" containerID="cri-o://0b169a7723e70b51be25ddf25cc2383a8c81a594fb06fd8c51374869476272af" gracePeriod=30 Apr 23 16:56:11.274354 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:11.274275 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh" podUID="c9c70a8d-65a0-4b62-a6ef-6e9c11189e65" containerName="main" probeResult="failure" output="Get \"https://10.132.0.61:8000/health\": dial tcp 10.132.0.61:8000: connect: connection refused" Apr 23 16:56:21.274521 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:21.274477 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh" podUID="c9c70a8d-65a0-4b62-a6ef-6e9c11189e65" containerName="main" probeResult="failure" output="Get \"https://10.132.0.61:8000/health\": dial tcp 10.132.0.61:8000: connect: connection refused" Apr 23 16:56:31.274154 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:31.274111 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh" podUID="c9c70a8d-65a0-4b62-a6ef-6e9c11189e65" containerName="main" probeResult="failure" output="Get \"https://10.132.0.61:8000/health\": dial tcp 10.132.0.61:8000: connect: connection refused" Apr 23 16:56:39.973164 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:39.973132 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-77cc947894-lftbh_bcc1b2be-845f-421c-ae9c-ddcb84738a6a/main/0.log" Apr 23 16:56:39.973586 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:39.973569 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh" Apr 23 16:56:40.112050 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:40.111947 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f4v4\" (UniqueName: \"kubernetes.io/projected/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-kube-api-access-2f4v4\") pod \"bcc1b2be-845f-421c-ae9c-ddcb84738a6a\" (UID: \"bcc1b2be-845f-421c-ae9c-ddcb84738a6a\") " Apr 23 16:56:40.112050 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:40.112010 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-home\") pod \"bcc1b2be-845f-421c-ae9c-ddcb84738a6a\" (UID: \"bcc1b2be-845f-421c-ae9c-ddcb84738a6a\") " Apr 23 16:56:40.112050 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:40.112043 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-model-cache\") pod \"bcc1b2be-845f-421c-ae9c-ddcb84738a6a\" (UID: \"bcc1b2be-845f-421c-ae9c-ddcb84738a6a\") " Apr 23 16:56:40.112395 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:40.112153 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-dshm\") pod \"bcc1b2be-845f-421c-ae9c-ddcb84738a6a\" (UID: \"bcc1b2be-845f-421c-ae9c-ddcb84738a6a\") " Apr 23 16:56:40.112395 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:40.112197 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" 
(UniqueName: \"kubernetes.io/secret/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-tls-certs\") pod \"bcc1b2be-845f-421c-ae9c-ddcb84738a6a\" (UID: \"bcc1b2be-845f-421c-ae9c-ddcb84738a6a\") " Apr 23 16:56:40.112395 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:40.112253 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-kserve-provision-location\") pod \"bcc1b2be-845f-421c-ae9c-ddcb84738a6a\" (UID: \"bcc1b2be-845f-421c-ae9c-ddcb84738a6a\") " Apr 23 16:56:40.112395 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:40.112260 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-model-cache" (OuterVolumeSpecName: "model-cache") pod "bcc1b2be-845f-421c-ae9c-ddcb84738a6a" (UID: "bcc1b2be-845f-421c-ae9c-ddcb84738a6a"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:56:40.112626 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:40.112443 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-home" (OuterVolumeSpecName: "home") pod "bcc1b2be-845f-421c-ae9c-ddcb84738a6a" (UID: "bcc1b2be-845f-421c-ae9c-ddcb84738a6a"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:56:40.112626 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:40.112586 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-home\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:56:40.112626 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:40.112604 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-model-cache\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:56:40.114694 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:40.114647 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-kube-api-access-2f4v4" (OuterVolumeSpecName: "kube-api-access-2f4v4") pod "bcc1b2be-845f-421c-ae9c-ddcb84738a6a" (UID: "bcc1b2be-845f-421c-ae9c-ddcb84738a6a"). InnerVolumeSpecName "kube-api-access-2f4v4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:56:40.114816 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:40.114717 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-dshm" (OuterVolumeSpecName: "dshm") pod "bcc1b2be-845f-421c-ae9c-ddcb84738a6a" (UID: "bcc1b2be-845f-421c-ae9c-ddcb84738a6a"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:56:40.114816 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:40.114721 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "bcc1b2be-845f-421c-ae9c-ddcb84738a6a" (UID: "bcc1b2be-845f-421c-ae9c-ddcb84738a6a"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:56:40.151842 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:40.151802 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bcc1b2be-845f-421c-ae9c-ddcb84738a6a" (UID: "bcc1b2be-845f-421c-ae9c-ddcb84738a6a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:56:40.213464 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:40.213426 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-dshm\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:56:40.213464 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:40.213455 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-tls-certs\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:56:40.213464 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:40.213467 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-kserve-provision-location\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:56:40.213775 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:40.213476 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2f4v4\" (UniqueName: \"kubernetes.io/projected/bcc1b2be-845f-421c-ae9c-ddcb84738a6a-kube-api-access-2f4v4\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:56:40.685197 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:40.685168 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-77cc947894-lftbh_bcc1b2be-845f-421c-ae9c-ddcb84738a6a/main/0.log" Apr 23 16:56:40.685597 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:40.685570 2580 generic.go:358] "Generic (PLEG): container finished" podID="bcc1b2be-845f-421c-ae9c-ddcb84738a6a" containerID="0b169a7723e70b51be25ddf25cc2383a8c81a594fb06fd8c51374869476272af" exitCode=137 Apr 23 16:56:40.685688 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:40.685660 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh" event={"ID":"bcc1b2be-845f-421c-ae9c-ddcb84738a6a","Type":"ContainerDied","Data":"0b169a7723e70b51be25ddf25cc2383a8c81a594fb06fd8c51374869476272af"} Apr 23 16:56:40.685735 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:40.685673 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh" Apr 23 16:56:40.685735 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:40.685709 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh" event={"ID":"bcc1b2be-845f-421c-ae9c-ddcb84738a6a","Type":"ContainerDied","Data":"14d729a23211c12e70c610b68947eee55e8f822895b322d86b2815ec5760e500"} Apr 23 16:56:40.685735 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:40.685731 2580 scope.go:117] "RemoveContainer" containerID="0b169a7723e70b51be25ddf25cc2383a8c81a594fb06fd8c51374869476272af" Apr 23 16:56:40.708281 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:40.708184 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh"] Apr 23 16:56:40.710771 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:40.710743 2580 scope.go:117] "RemoveContainer" containerID="f5ff764afd4149f33f7b4dba90dbc25a86bea7cf4bca96df0bdf5c2023efe53d" Apr 23 16:56:40.711911 ip-10-0-128-198 
kubenswrapper[2580]: I0423 16:56:40.711890 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-77cc947894-lftbh"]
Apr 23 16:56:40.771240 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:40.771217 2580 scope.go:117] "RemoveContainer" containerID="0b169a7723e70b51be25ddf25cc2383a8c81a594fb06fd8c51374869476272af"
Apr 23 16:56:40.771569 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:56:40.771546 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b169a7723e70b51be25ddf25cc2383a8c81a594fb06fd8c51374869476272af\": container with ID starting with 0b169a7723e70b51be25ddf25cc2383a8c81a594fb06fd8c51374869476272af not found: ID does not exist" containerID="0b169a7723e70b51be25ddf25cc2383a8c81a594fb06fd8c51374869476272af"
Apr 23 16:56:40.771655 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:40.771583 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b169a7723e70b51be25ddf25cc2383a8c81a594fb06fd8c51374869476272af"} err="failed to get container status \"0b169a7723e70b51be25ddf25cc2383a8c81a594fb06fd8c51374869476272af\": rpc error: code = NotFound desc = could not find container \"0b169a7723e70b51be25ddf25cc2383a8c81a594fb06fd8c51374869476272af\": container with ID starting with 0b169a7723e70b51be25ddf25cc2383a8c81a594fb06fd8c51374869476272af not found: ID does not exist"
Apr 23 16:56:40.771655 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:40.771610 2580 scope.go:117] "RemoveContainer" containerID="f5ff764afd4149f33f7b4dba90dbc25a86bea7cf4bca96df0bdf5c2023efe53d"
Apr 23 16:56:40.771898 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:56:40.771879 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5ff764afd4149f33f7b4dba90dbc25a86bea7cf4bca96df0bdf5c2023efe53d\": container with ID starting with f5ff764afd4149f33f7b4dba90dbc25a86bea7cf4bca96df0bdf5c2023efe53d not found: ID does not exist" containerID="f5ff764afd4149f33f7b4dba90dbc25a86bea7cf4bca96df0bdf5c2023efe53d"
Apr 23 16:56:40.771943 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:40.771905 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5ff764afd4149f33f7b4dba90dbc25a86bea7cf4bca96df0bdf5c2023efe53d"} err="failed to get container status \"f5ff764afd4149f33f7b4dba90dbc25a86bea7cf4bca96df0bdf5c2023efe53d\": rpc error: code = NotFound desc = could not find container \"f5ff764afd4149f33f7b4dba90dbc25a86bea7cf4bca96df0bdf5c2023efe53d\": container with ID starting with f5ff764afd4149f33f7b4dba90dbc25a86bea7cf4bca96df0bdf5c2023efe53d not found: ID does not exist"
Apr 23 16:56:41.274732 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:41.274694 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh" podUID="c9c70a8d-65a0-4b62-a6ef-6e9c11189e65" containerName="main" probeResult="failure" output="Get \"https://10.132.0.61:8000/health\": dial tcp 10.132.0.61:8000: connect: connection refused"
Apr 23 16:56:42.540192 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:42.540155 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcc1b2be-845f-421c-ae9c-ddcb84738a6a" path="/var/lib/kubelet/pods/bcc1b2be-845f-421c-ae9c-ddcb84738a6a/volumes"
Apr 23 16:56:47.324453 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:47.324415 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-6cf46f7d78-p6tn5"]
Apr 23 16:56:47.324825 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:47.324744 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/llmisvc-controller-manager-6cf46f7d78-p6tn5" podUID="dfe094ff-b04d-4c9e-9dcc-cfe7279e071f" containerName="manager" containerID="cri-o://ddba0076fc1e9e1ed3448153c6f113004f17aeb372e3b24c61e0893bc47b25fe" gracePeriod=30
Apr 23 16:56:47.588846 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:47.588775 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-6cf46f7d78-p6tn5"
Apr 23 16:56:47.680421 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:47.680393 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dfe094ff-b04d-4c9e-9dcc-cfe7279e071f-cert\") pod \"dfe094ff-b04d-4c9e-9dcc-cfe7279e071f\" (UID: \"dfe094ff-b04d-4c9e-9dcc-cfe7279e071f\") "
Apr 23 16:56:47.680600 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:47.680524 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h572h\" (UniqueName: \"kubernetes.io/projected/dfe094ff-b04d-4c9e-9dcc-cfe7279e071f-kube-api-access-h572h\") pod \"dfe094ff-b04d-4c9e-9dcc-cfe7279e071f\" (UID: \"dfe094ff-b04d-4c9e-9dcc-cfe7279e071f\") "
Apr 23 16:56:47.682710 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:47.682669 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfe094ff-b04d-4c9e-9dcc-cfe7279e071f-kube-api-access-h572h" (OuterVolumeSpecName: "kube-api-access-h572h") pod "dfe094ff-b04d-4c9e-9dcc-cfe7279e071f" (UID: "dfe094ff-b04d-4c9e-9dcc-cfe7279e071f"). InnerVolumeSpecName "kube-api-access-h572h". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:56:47.682828 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:47.682806 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfe094ff-b04d-4c9e-9dcc-cfe7279e071f-cert" (OuterVolumeSpecName: "cert") pod "dfe094ff-b04d-4c9e-9dcc-cfe7279e071f" (UID: "dfe094ff-b04d-4c9e-9dcc-cfe7279e071f"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:56:47.717735 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:47.717705 2580 generic.go:358] "Generic (PLEG): container finished" podID="dfe094ff-b04d-4c9e-9dcc-cfe7279e071f" containerID="ddba0076fc1e9e1ed3448153c6f113004f17aeb372e3b24c61e0893bc47b25fe" exitCode=0
Apr 23 16:56:47.717897 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:47.717766 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-6cf46f7d78-p6tn5"
Apr 23 16:56:47.717897 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:47.717782 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6cf46f7d78-p6tn5" event={"ID":"dfe094ff-b04d-4c9e-9dcc-cfe7279e071f","Type":"ContainerDied","Data":"ddba0076fc1e9e1ed3448153c6f113004f17aeb372e3b24c61e0893bc47b25fe"}
Apr 23 16:56:47.717897 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:47.717824 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6cf46f7d78-p6tn5" event={"ID":"dfe094ff-b04d-4c9e-9dcc-cfe7279e071f","Type":"ContainerDied","Data":"1f1d6316202e29db5ce7654b8dca6a134d981da15a99f04865576d942df06449"}
Apr 23 16:56:47.717897 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:47.717853 2580 scope.go:117] "RemoveContainer" containerID="ddba0076fc1e9e1ed3448153c6f113004f17aeb372e3b24c61e0893bc47b25fe"
Apr 23 16:56:47.729187 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:47.729170 2580 scope.go:117] "RemoveContainer" containerID="ddba0076fc1e9e1ed3448153c6f113004f17aeb372e3b24c61e0893bc47b25fe"
Apr 23 16:56:47.729520 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:56:47.729498 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddba0076fc1e9e1ed3448153c6f113004f17aeb372e3b24c61e0893bc47b25fe\": container with ID starting with ddba0076fc1e9e1ed3448153c6f113004f17aeb372e3b24c61e0893bc47b25fe not found: ID does not exist" containerID="ddba0076fc1e9e1ed3448153c6f113004f17aeb372e3b24c61e0893bc47b25fe"
Apr 23 16:56:47.729582 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:47.729528 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddba0076fc1e9e1ed3448153c6f113004f17aeb372e3b24c61e0893bc47b25fe"} err="failed to get container status \"ddba0076fc1e9e1ed3448153c6f113004f17aeb372e3b24c61e0893bc47b25fe\": rpc error: code = NotFound desc = could not find container \"ddba0076fc1e9e1ed3448153c6f113004f17aeb372e3b24c61e0893bc47b25fe\": container with ID starting with ddba0076fc1e9e1ed3448153c6f113004f17aeb372e3b24c61e0893bc47b25fe not found: ID does not exist"
Apr 23 16:56:47.739573 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:47.739548 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-6cf46f7d78-p6tn5"]
Apr 23 16:56:47.742275 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:47.742249 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/llmisvc-controller-manager-6cf46f7d78-p6tn5"]
Apr 23 16:56:47.781619 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:47.781589 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h572h\" (UniqueName: \"kubernetes.io/projected/dfe094ff-b04d-4c9e-9dcc-cfe7279e071f-kube-api-access-h572h\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:56:47.781619 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:47.781616 2580 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dfe094ff-b04d-4c9e-9dcc-cfe7279e071f-cert\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 16:56:48.537701 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:48.537663 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfe094ff-b04d-4c9e-9dcc-cfe7279e071f" path="/var/lib/kubelet/pods/dfe094ff-b04d-4c9e-9dcc-cfe7279e071f/volumes"
Apr 23 16:56:51.274123 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:56:51.274071 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh" podUID="c9c70a8d-65a0-4b62-a6ef-6e9c11189e65" containerName="main" probeResult="failure" output="Get \"https://10.132.0.61:8000/health\": dial tcp 10.132.0.61:8000: connect: connection refused"
Apr 23 16:57:01.274056 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:01.274006 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh" podUID="c9c70a8d-65a0-4b62-a6ef-6e9c11189e65" containerName="main" probeResult="failure" output="Get \"https://10.132.0.61:8000/health\": dial tcp 10.132.0.61:8000: connect: connection refused"
Apr 23 16:57:11.274383 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:11.274336 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh" podUID="c9c70a8d-65a0-4b62-a6ef-6e9c11189e65" containerName="main" probeResult="failure" output="Get \"https://10.132.0.61:8000/health\": dial tcp 10.132.0.61:8000: connect: connection refused"
Apr 23 16:57:21.283520 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:21.283486 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh"
Apr 23 16:57:21.291197 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:21.291168 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh"
Apr 23 16:57:25.297037 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.297002 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c"]
Apr 23 16:57:25.297515 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.297476 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="132366bd-209d-482b-9ee7-1060ce431a79" containerName="storage-initializer"
Apr 23 16:57:25.297568 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.297518 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="132366bd-209d-482b-9ee7-1060ce431a79" containerName="storage-initializer"
Apr 23 16:57:25.297568 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.297531 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dfe094ff-b04d-4c9e-9dcc-cfe7279e071f" containerName="manager"
Apr 23 16:57:25.297568 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.297538 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfe094ff-b04d-4c9e-9dcc-cfe7279e071f" containerName="manager"
Apr 23 16:57:25.297568 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.297550 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="132366bd-209d-482b-9ee7-1060ce431a79" containerName="main"
Apr 23 16:57:25.297568 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.297556 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="132366bd-209d-482b-9ee7-1060ce431a79" containerName="main"
Apr 23 16:57:25.297568 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.297562 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bcc1b2be-845f-421c-ae9c-ddcb84738a6a" containerName="storage-initializer"
Apr 23 16:57:25.297568 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.297567 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc1b2be-845f-421c-ae9c-ddcb84738a6a" containerName="storage-initializer"
Apr 23 16:57:25.297806 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.297583 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bcc1b2be-845f-421c-ae9c-ddcb84738a6a" containerName="main"
Apr 23 16:57:25.297806 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.297589 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc1b2be-845f-421c-ae9c-ddcb84738a6a" containerName="main"
Apr 23 16:57:25.297806 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.297652 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="dfe094ff-b04d-4c9e-9dcc-cfe7279e071f" containerName="manager"
Apr 23 16:57:25.297806 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.297662 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="132366bd-209d-482b-9ee7-1060ce431a79" containerName="main"
Apr 23 16:57:25.297806 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.297667 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="bcc1b2be-845f-421c-ae9c-ddcb84738a6a" containerName="main"
Apr 23 16:57:25.302991 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.302971 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c"
Apr 23 16:57:25.305868 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.305845 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8de1d74aab16d9cabd8b5aafeb5248e8-kserve-self-signed-certs\""
Apr 23 16:57:25.310223 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.310197 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c"]
Apr 23 16:57:25.407429 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.407392 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c\" (UID: \"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c"
Apr 23 16:57:25.407634 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.407437 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c\" (UID: \"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c"
Apr 23 16:57:25.407634 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.407468 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhktr\" (UniqueName: \"kubernetes.io/projected/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-kube-api-access-rhktr\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c\" (UID: \"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c"
Apr 23 16:57:25.407634 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.407545 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c\" (UID: \"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c"
Apr 23 16:57:25.407634 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.407606 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c\" (UID: \"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c"
Apr 23 16:57:25.407810 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.407660 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c\" (UID: \"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c"
Apr 23 16:57:25.509137 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.509098 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c\" (UID: \"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c"
Apr 23 16:57:25.509137 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.509137 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c\" (UID: \"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c"
Apr 23 16:57:25.509386 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.509169 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c\" (UID: \"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c"
Apr 23 16:57:25.509386 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.509199 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c\" (UID: \"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c"
Apr 23 16:57:25.509386 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.509221 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhktr\" (UniqueName: \"kubernetes.io/projected/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-kube-api-access-rhktr\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c\" (UID: \"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c"
Apr 23 16:57:25.509386 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.509353 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c\" (UID: \"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c"
Apr 23 16:57:25.509611 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.509585 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c\" (UID: \"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c"
Apr 23 16:57:25.509611 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.509602 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c\" (UID: \"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c"
Apr 23 16:57:25.509732 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.509716 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c\" (UID: \"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c"
Apr 23 16:57:25.511718 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.511690 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c\" (UID: \"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c"
Apr 23 16:57:25.511906 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.511886 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c\" (UID: \"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c"
Apr 23 16:57:25.520161 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.520136 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhktr\" (UniqueName: \"kubernetes.io/projected/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-kube-api-access-rhktr\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c\" (UID: \"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c"
Apr 23 16:57:25.616074 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.615981 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c"
Apr 23 16:57:25.750119 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.750084 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c"]
Apr 23 16:57:25.753181 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:57:25.753157 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9dcbcbff_ab49_4b7d_86d4_6eeecc35fd8e.slice/crio-b0af2b4295ff04f5338d8d07cc48d13da76a9e64d36fa8eda770e9a678a6046a WatchSource:0}: Error finding container b0af2b4295ff04f5338d8d07cc48d13da76a9e64d36fa8eda770e9a678a6046a: Status 404 returned error can't find the container with id b0af2b4295ff04f5338d8d07cc48d13da76a9e64d36fa8eda770e9a678a6046a
Apr 23 16:57:25.754960 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.754944 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 16:57:25.870482 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.870389 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c" event={"ID":"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e","Type":"ContainerStarted","Data":"0613b8e58a12a5e26016ad1764a2d90294fb0d9021a66773b79027875bca4dd0"}
Apr 23 16:57:25.870482 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:25.870435 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c" event={"ID":"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e","Type":"ContainerStarted","Data":"b0af2b4295ff04f5338d8d07cc48d13da76a9e64d36fa8eda770e9a678a6046a"}
Apr 23 16:57:29.888508 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:29.888471 2580 generic.go:358] "Generic (PLEG): container finished" podID="9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e" containerID="0613b8e58a12a5e26016ad1764a2d90294fb0d9021a66773b79027875bca4dd0" exitCode=0
Apr 23 16:57:29.888877 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:29.888547 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c" event={"ID":"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e","Type":"ContainerDied","Data":"0613b8e58a12a5e26016ad1764a2d90294fb0d9021a66773b79027875bca4dd0"}
Apr 23 16:57:30.896636 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:30.896594 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c" event={"ID":"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e","Type":"ContainerStarted","Data":"39ea38983e00907610cb7bec6b92c86bcc331f2c37b69290a698634e28ea78a4"}
Apr 23 16:57:30.924463 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:30.924385 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c" podStartSLOduration=5.924360627 podStartE2EDuration="5.924360627s" podCreationTimestamp="2026-04-23 16:57:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:57:30.921148542 +0000 UTC m=+1336.982222809" watchObservedRunningTime="2026-04-23 16:57:30.924360627 +0000 UTC m=+1336.985434916"
Apr 23 16:57:35.616425 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:35.616385 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c"
Apr 23 16:57:35.616425 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:35.616424 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c"
Apr 23 16:57:35.618161 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:35.618131 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c" podUID="9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.62:8000/health\": dial tcp 10.132.0.62:8000: connect: connection refused"
Apr 23 16:57:45.614012 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:45.613976 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh"]
Apr 23 16:57:45.614418 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:45.614364 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh" podUID="c9c70a8d-65a0-4b62-a6ef-6e9c11189e65" containerName="main" containerID="cri-o://2630c44af3f239879f75e33d7814ac3e7f4948fa2bcf29664cb49ceaccf12c2b" gracePeriod=30
Apr 23 16:57:45.617335 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:45.617279 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c" podUID="9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.62:8000/health\": dial tcp 10.132.0.62:8000: connect: connection refused"
Apr 23 16:57:50.304149 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.304104 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft"]
Apr 23 16:57:50.309618 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.309596 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft"
Apr 23 16:57:50.312463 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.312436 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-dockercfg-gkzdh\""
Apr 23 16:57:50.312463 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.312462 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\""
Apr 23 16:57:50.318113 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.317914 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft"]
Apr 23 16:57:50.338312 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.337424 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx"]
Apr 23 16:57:50.344874 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.344490 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/68b434c2-61f8-4afa-9a95-3443e031bff5-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft\" (UID: \"68b434c2-61f8-4afa-9a95-3443e031bff5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft"
Apr 23 16:57:50.345145 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.345125 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/68b434c2-61f8-4afa-9a95-3443e031bff5-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft\" (UID: \"68b434c2-61f8-4afa-9a95-3443e031bff5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft"
Apr 23 16:57:50.345356 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.345329 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb6xg\" (UniqueName: \"kubernetes.io/projected/68b434c2-61f8-4afa-9a95-3443e031bff5-kube-api-access-nb6xg\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft\" (UID: \"68b434c2-61f8-4afa-9a95-3443e031bff5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft"
Apr 23 16:57:50.345436 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.345403 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/68b434c2-61f8-4afa-9a95-3443e031bff5-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft\" (UID: \"68b434c2-61f8-4afa-9a95-3443e031bff5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft"
Apr 23 16:57:50.345500 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.345461 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/68b434c2-61f8-4afa-9a95-3443e031bff5-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft\" (UID: \"68b434c2-61f8-4afa-9a95-3443e031bff5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft"
Apr 23 16:57:50.345559 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.345519 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/68b434c2-61f8-4afa-9a95-3443e031bff5-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft\" (UID: \"68b434c2-61f8-4afa-9a95-3443e031bff5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft"
Apr 23 16:57:50.348035 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.347992 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx"
Apr 23 16:57:50.355527 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.355503 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx"]
Apr 23 16:57:50.446386 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.446345 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/68b434c2-61f8-4afa-9a95-3443e031bff5-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft\" (UID: \"68b434c2-61f8-4afa-9a95-3443e031bff5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft"
Apr 23 16:57:50.446578 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.446406 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/68b434c2-61f8-4afa-9a95-3443e031bff5-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft\" (UID: \"68b434c2-61f8-4afa-9a95-3443e031bff5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft"
Apr 23 16:57:50.446578 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.446439 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a6a10f65-efca-46fc-aa37-3712b44c85bb-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx\" (UID: \"a6a10f65-efca-46fc-aa37-3712b44c85bb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx"
Apr 23 16:57:50.446578 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.446482 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nb6xg\" (UniqueName: \"kubernetes.io/projected/68b434c2-61f8-4afa-9a95-3443e031bff5-kube-api-access-nb6xg\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft\" (UID: \"68b434c2-61f8-4afa-9a95-3443e031bff5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft"
Apr 23 16:57:50.446578 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.446512 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a6a10f65-efca-46fc-aa37-3712b44c85bb-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx\" (UID: \"a6a10f65-efca-46fc-aa37-3712b44c85bb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx"
Apr 23 16:57:50.446578 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.446540 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a6a10f65-efca-46fc-aa37-3712b44c85bb-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx\" (UID: \"a6a10f65-efca-46fc-aa37-3712b44c85bb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx"
Apr 23 16:57:50.446578 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.446565 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv7fp\" (UniqueName: \"kubernetes.io/projected/a6a10f65-efca-46fc-aa37-3712b44c85bb-kube-api-access-nv7fp\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx\" (UID: \"a6a10f65-efca-46fc-aa37-3712b44c85bb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx"
Apr 23 16:57:50.446928 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.446604 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/68b434c2-61f8-4afa-9a95-3443e031bff5-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft\" (UID: \"68b434c2-61f8-4afa-9a95-3443e031bff5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft"
Apr 23 16:57:50.446928 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.446643 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a6a10f65-efca-46fc-aa37-3712b44c85bb-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx\" (UID: \"a6a10f65-efca-46fc-aa37-3712b44c85bb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx"
Apr 23 16:57:50.446928 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.446672 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a6a10f65-efca-46fc-aa37-3712b44c85bb-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx\" (UID: \"a6a10f65-efca-46fc-aa37-3712b44c85bb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx"
Apr 23 16:57:50.446928 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.446728 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/68b434c2-61f8-4afa-9a95-3443e031bff5-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft\" (UID: \"68b434c2-61f8-4afa-9a95-3443e031bff5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft"
Apr 23 16:57:50.446928 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.446774 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/68b434c2-61f8-4afa-9a95-3443e031bff5-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft\" (UID: \"68b434c2-61f8-4afa-9a95-3443e031bff5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft"
Apr 23 16:57:50.446928 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.446849 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/68b434c2-61f8-4afa-9a95-3443e031bff5-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft\" (UID: \"68b434c2-61f8-4afa-9a95-3443e031bff5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft"
Apr 23 16:57:50.447266 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.446924 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/68b434c2-61f8-4afa-9a95-3443e031bff5-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft\" (UID:
\"68b434c2-61f8-4afa-9a95-3443e031bff5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" Apr 23 16:57:50.447266 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.447029 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/68b434c2-61f8-4afa-9a95-3443e031bff5-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft\" (UID: \"68b434c2-61f8-4afa-9a95-3443e031bff5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" Apr 23 16:57:50.448860 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.448837 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/68b434c2-61f8-4afa-9a95-3443e031bff5-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft\" (UID: \"68b434c2-61f8-4afa-9a95-3443e031bff5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" Apr 23 16:57:50.449131 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.449110 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/68b434c2-61f8-4afa-9a95-3443e031bff5-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft\" (UID: \"68b434c2-61f8-4afa-9a95-3443e031bff5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" Apr 23 16:57:50.455169 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.455146 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb6xg\" (UniqueName: \"kubernetes.io/projected/68b434c2-61f8-4afa-9a95-3443e031bff5-kube-api-access-nb6xg\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft\" (UID: \"68b434c2-61f8-4afa-9a95-3443e031bff5\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" Apr 23 16:57:50.547443 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.547408 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a6a10f65-efca-46fc-aa37-3712b44c85bb-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx\" (UID: \"a6a10f65-efca-46fc-aa37-3712b44c85bb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" Apr 23 16:57:50.547640 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.547462 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a6a10f65-efca-46fc-aa37-3712b44c85bb-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx\" (UID: \"a6a10f65-efca-46fc-aa37-3712b44c85bb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" Apr 23 16:57:50.547640 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.547485 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a6a10f65-efca-46fc-aa37-3712b44c85bb-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx\" (UID: \"a6a10f65-efca-46fc-aa37-3712b44c85bb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" Apr 23 16:57:50.547640 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.547511 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nv7fp\" (UniqueName: \"kubernetes.io/projected/a6a10f65-efca-46fc-aa37-3712b44c85bb-kube-api-access-nv7fp\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx\" (UID: \"a6a10f65-efca-46fc-aa37-3712b44c85bb\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" Apr 23 16:57:50.547640 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.547554 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a6a10f65-efca-46fc-aa37-3712b44c85bb-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx\" (UID: \"a6a10f65-efca-46fc-aa37-3712b44c85bb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" Apr 23 16:57:50.547640 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.547582 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a6a10f65-efca-46fc-aa37-3712b44c85bb-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx\" (UID: \"a6a10f65-efca-46fc-aa37-3712b44c85bb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" Apr 23 16:57:50.547938 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.547871 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a6a10f65-efca-46fc-aa37-3712b44c85bb-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx\" (UID: \"a6a10f65-efca-46fc-aa37-3712b44c85bb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" Apr 23 16:57:50.548076 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.548035 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a6a10f65-efca-46fc-aa37-3712b44c85bb-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx\" (UID: \"a6a10f65-efca-46fc-aa37-3712b44c85bb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" 
Apr 23 16:57:50.548395 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.548351 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a6a10f65-efca-46fc-aa37-3712b44c85bb-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx\" (UID: \"a6a10f65-efca-46fc-aa37-3712b44c85bb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" Apr 23 16:57:50.550513 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.550470 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a6a10f65-efca-46fc-aa37-3712b44c85bb-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx\" (UID: \"a6a10f65-efca-46fc-aa37-3712b44c85bb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" Apr 23 16:57:50.550714 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.550695 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a6a10f65-efca-46fc-aa37-3712b44c85bb-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx\" (UID: \"a6a10f65-efca-46fc-aa37-3712b44c85bb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" Apr 23 16:57:50.562824 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.562756 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv7fp\" (UniqueName: \"kubernetes.io/projected/a6a10f65-efca-46fc-aa37-3712b44c85bb-kube-api-access-nv7fp\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx\" (UID: \"a6a10f65-efca-46fc-aa37-3712b44c85bb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" Apr 23 16:57:50.622235 ip-10-0-128-198 kubenswrapper[2580]: I0423 
16:57:50.622200 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" Apr 23 16:57:50.663307 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.663257 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" Apr 23 16:57:50.771895 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.771841 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft"] Apr 23 16:57:50.772339 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:57:50.772284 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68b434c2_61f8_4afa_9a95_3443e031bff5.slice/crio-dc635217d33aa074029c63900d61c52c72f41ccae7145d8abdbed7cf0bba89c1 WatchSource:0}: Error finding container dc635217d33aa074029c63900d61c52c72f41ccae7145d8abdbed7cf0bba89c1: Status 404 returned error can't find the container with id dc635217d33aa074029c63900d61c52c72f41ccae7145d8abdbed7cf0bba89c1 Apr 23 16:57:50.813629 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.813601 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx"] Apr 23 16:57:50.816513 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:57:50.816483 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6a10f65_efca_46fc_aa37_3712b44c85bb.slice/crio-d1f1112eac8b009e0c99f6b16a77c3e19c1e29d546ac4713decf0435275d3e38 WatchSource:0}: Error finding container d1f1112eac8b009e0c99f6b16a77c3e19c1e29d546ac4713decf0435275d3e38: Status 404 returned error can't find the container with id d1f1112eac8b009e0c99f6b16a77c3e19c1e29d546ac4713decf0435275d3e38 
Apr 23 16:57:50.983869 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.983826 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" event={"ID":"68b434c2-61f8-4afa-9a95-3443e031bff5","Type":"ContainerStarted","Data":"dc635217d33aa074029c63900d61c52c72f41ccae7145d8abdbed7cf0bba89c1"} Apr 23 16:57:50.985339 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.985307 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" event={"ID":"a6a10f65-efca-46fc-aa37-3712b44c85bb","Type":"ContainerStarted","Data":"5dfc3d21b40551e8ce1fbea7b093b7d764affa5ed406f86329f82f24e20a9f02"} Apr 23 16:57:50.985456 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:50.985348 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" event={"ID":"a6a10f65-efca-46fc-aa37-3712b44c85bb","Type":"ContainerStarted","Data":"d1f1112eac8b009e0c99f6b16a77c3e19c1e29d546ac4713decf0435275d3e38"} Apr 23 16:57:51.992797 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:51.992730 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" event={"ID":"68b434c2-61f8-4afa-9a95-3443e031bff5","Type":"ContainerStarted","Data":"f0f34d6c20c11ab004e3e91716978ce530ccb0b09b6237345ae0d14941df26bc"} Apr 23 16:57:51.993420 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:51.993161 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" Apr 23 16:57:52.999829 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:52.999730 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" 
event={"ID":"68b434c2-61f8-4afa-9a95-3443e031bff5","Type":"ContainerStarted","Data":"5c692948aacc3d3f2c51507e9355b360432d5ecc04a266fbee62440cff7ee453"} Apr 23 16:57:55.616796 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:55.616753 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c" podUID="9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.62:8000/health\": dial tcp 10.132.0.62:8000: connect: connection refused" Apr 23 16:57:56.016082 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:56.016044 2580 generic.go:358] "Generic (PLEG): container finished" podID="a6a10f65-efca-46fc-aa37-3712b44c85bb" containerID="5dfc3d21b40551e8ce1fbea7b093b7d764affa5ed406f86329f82f24e20a9f02" exitCode=0 Apr 23 16:57:56.016282 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:56.016122 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" event={"ID":"a6a10f65-efca-46fc-aa37-3712b44c85bb","Type":"ContainerDied","Data":"5dfc3d21b40551e8ce1fbea7b093b7d764affa5ed406f86329f82f24e20a9f02"} Apr 23 16:57:57.026262 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:57.026229 2580 generic.go:358] "Generic (PLEG): container finished" podID="68b434c2-61f8-4afa-9a95-3443e031bff5" containerID="5c692948aacc3d3f2c51507e9355b360432d5ecc04a266fbee62440cff7ee453" exitCode=0 Apr 23 16:57:57.026794 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:57.026328 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" event={"ID":"68b434c2-61f8-4afa-9a95-3443e031bff5","Type":"ContainerDied","Data":"5c692948aacc3d3f2c51507e9355b360432d5ecc04a266fbee62440cff7ee453"} Apr 23 16:57:57.028280 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:57.028254 2580 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" event={"ID":"a6a10f65-efca-46fc-aa37-3712b44c85bb","Type":"ContainerStarted","Data":"54c3b71258b1624787128406e013fd540b9a4be990b4a55115d3fc6afda1f69a"} Apr 23 16:57:58.036803 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:58.036762 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" event={"ID":"68b434c2-61f8-4afa-9a95-3443e031bff5","Type":"ContainerStarted","Data":"6c2d2fb1e7e999a0d56dc8a831e19ec7f8f651379512ca7cd1f0cfa7449e9de3"} Apr 23 16:57:58.061241 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:58.061175 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" podStartSLOduration=7.210520774 podStartE2EDuration="8.061156619s" podCreationTimestamp="2026-04-23 16:57:50 +0000 UTC" firstStartedPulling="2026-04-23 16:57:50.774898489 +0000 UTC m=+1356.835972729" lastFinishedPulling="2026-04-23 16:57:51.625534326 +0000 UTC m=+1357.686608574" observedRunningTime="2026-04-23 16:57:58.057462188 +0000 UTC m=+1364.118536462" watchObservedRunningTime="2026-04-23 16:57:58.061156619 +0000 UTC m=+1364.122230874" Apr 23 16:57:58.061418 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:57:58.061357 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" podStartSLOduration=8.061345233 podStartE2EDuration="8.061345233s" podCreationTimestamp="2026-04-23 16:57:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:57:57.08761258 +0000 UTC m=+1363.148686841" watchObservedRunningTime="2026-04-23 16:57:58.061345233 +0000 UTC m=+1364.122419514" Apr 23 16:58:00.623270 ip-10-0-128-198 
kubenswrapper[2580]: I0423 16:58:00.623231 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" Apr 23 16:58:00.623761 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:00.623428 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" Apr 23 16:58:00.625116 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:00.625079 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" podUID="68b434c2-61f8-4afa-9a95-3443e031bff5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.63:8001/health\": dial tcp 10.132.0.63:8001: connect: connection refused" Apr 23 16:58:00.664015 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:00.663972 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" Apr 23 16:58:00.664202 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:00.664131 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" Apr 23 16:58:00.665685 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:00.665652 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" podUID="a6a10f65-efca-46fc-aa37-3712b44c85bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.64:8000/health\": dial tcp 10.132.0.64:8000: connect: connection refused" Apr 23 16:58:01.070562 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:01.070529 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" Apr 23 16:58:05.616576 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:05.616523 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c" podUID="9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.62:8000/health\": dial tcp 10.132.0.62:8000: connect: connection refused" Apr 23 16:58:10.623405 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:10.623355 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" podUID="68b434c2-61f8-4afa-9a95-3443e031bff5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.63:8001/health\": dial tcp 10.132.0.63:8001: connect: connection refused" Apr 23 16:58:10.664142 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:10.664090 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" podUID="a6a10f65-efca-46fc-aa37-3712b44c85bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.64:8000/health\": dial tcp 10.132.0.64:8000: connect: connection refused" Apr 23 16:58:15.617104 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:15.617052 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c" podUID="9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.62:8000/health\": dial tcp 10.132.0.62:8000: connect: connection refused" Apr 23 16:58:15.934002 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:15.933965 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-5bbfcf7894-tdvhh_c9c70a8d-65a0-4b62-a6ef-6e9c11189e65/main/0.log" Apr 23 16:58:15.934477 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:15.934457 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh" Apr 23 16:58:16.102575 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:16.102531 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-dshm\") pod \"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65\" (UID: \"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65\") " Apr 23 16:58:16.102778 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:16.102621 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-model-cache\") pod \"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65\" (UID: \"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65\") " Apr 23 16:58:16.102778 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:16.102660 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlzln\" (UniqueName: \"kubernetes.io/projected/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-kube-api-access-rlzln\") pod \"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65\" (UID: \"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65\") " Apr 23 16:58:16.102778 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:16.102718 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-tls-certs\") pod \"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65\" (UID: \"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65\") " Apr 23 16:58:16.102778 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:16.102744 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-home\") pod \"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65\" (UID: \"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65\") " Apr 23 16:58:16.102778 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:16.102774 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-kserve-provision-location\") pod \"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65\" (UID: \"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65\") " Apr 23 16:58:16.103042 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:16.102895 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-model-cache" (OuterVolumeSpecName: "model-cache") pod "c9c70a8d-65a0-4b62-a6ef-6e9c11189e65" (UID: "c9c70a8d-65a0-4b62-a6ef-6e9c11189e65"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:58:16.103101 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:16.103079 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-home" (OuterVolumeSpecName: "home") pod "c9c70a8d-65a0-4b62-a6ef-6e9c11189e65" (UID: "c9c70a8d-65a0-4b62-a6ef-6e9c11189e65"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:58:16.103160 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:16.103090 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-model-cache\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:58:16.105510 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:16.105476 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-dshm" (OuterVolumeSpecName: "dshm") pod "c9c70a8d-65a0-4b62-a6ef-6e9c11189e65" (UID: "c9c70a8d-65a0-4b62-a6ef-6e9c11189e65"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:58:16.105660 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:16.105511 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c9c70a8d-65a0-4b62-a6ef-6e9c11189e65" (UID: "c9c70a8d-65a0-4b62-a6ef-6e9c11189e65"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:58:16.105908 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:16.105873 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-kube-api-access-rlzln" (OuterVolumeSpecName: "kube-api-access-rlzln") pod "c9c70a8d-65a0-4b62-a6ef-6e9c11189e65" (UID: "c9c70a8d-65a0-4b62-a6ef-6e9c11189e65"). InnerVolumeSpecName "kube-api-access-rlzln". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:58:16.139001 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:16.138962 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-5bbfcf7894-tdvhh_c9c70a8d-65a0-4b62-a6ef-6e9c11189e65/main/0.log" Apr 23 16:58:16.139419 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:16.139390 2580 generic.go:358] "Generic (PLEG): container finished" podID="c9c70a8d-65a0-4b62-a6ef-6e9c11189e65" containerID="2630c44af3f239879f75e33d7814ac3e7f4948fa2bcf29664cb49ceaccf12c2b" exitCode=137 Apr 23 16:58:16.139528 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:16.139478 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh" Apr 23 16:58:16.139528 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:16.139486 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh" event={"ID":"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65","Type":"ContainerDied","Data":"2630c44af3f239879f75e33d7814ac3e7f4948fa2bcf29664cb49ceaccf12c2b"} Apr 23 16:58:16.139656 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:16.139533 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh" event={"ID":"c9c70a8d-65a0-4b62-a6ef-6e9c11189e65","Type":"ContainerDied","Data":"7d689fbc87556511b548d77014726e0383ca6fde732174cb1e4506d5b681e3cf"} Apr 23 16:58:16.139656 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:16.139555 2580 scope.go:117] "RemoveContainer" containerID="2630c44af3f239879f75e33d7814ac3e7f4948fa2bcf29664cb49ceaccf12c2b" Apr 23 16:58:16.165692 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:16.165590 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-kserve-provision-location" 
(OuterVolumeSpecName: "kserve-provision-location") pod "c9c70a8d-65a0-4b62-a6ef-6e9c11189e65" (UID: "c9c70a8d-65a0-4b62-a6ef-6e9c11189e65"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:58:16.177177 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:16.177149 2580 scope.go:117] "RemoveContainer" containerID="4e4fbf63b5cb23f099fe5b1f52d1d61c7befcbb3187057c4986bd07d8bcb9d2b" Apr 23 16:58:16.204011 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:16.203963 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-tls-certs\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:58:16.204011 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:16.204004 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-home\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:58:16.204246 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:16.204021 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-kserve-provision-location\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:58:16.204246 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:16.204036 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-dshm\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 16:58:16.204246 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:16.204053 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rlzln\" (UniqueName: \"kubernetes.io/projected/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65-kube-api-access-rlzln\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 
16:58:16.271745 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:16.271716 2580 scope.go:117] "RemoveContainer" containerID="2630c44af3f239879f75e33d7814ac3e7f4948fa2bcf29664cb49ceaccf12c2b" Apr 23 16:58:16.272111 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:58:16.272085 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2630c44af3f239879f75e33d7814ac3e7f4948fa2bcf29664cb49ceaccf12c2b\": container with ID starting with 2630c44af3f239879f75e33d7814ac3e7f4948fa2bcf29664cb49ceaccf12c2b not found: ID does not exist" containerID="2630c44af3f239879f75e33d7814ac3e7f4948fa2bcf29664cb49ceaccf12c2b" Apr 23 16:58:16.272223 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:16.272126 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2630c44af3f239879f75e33d7814ac3e7f4948fa2bcf29664cb49ceaccf12c2b"} err="failed to get container status \"2630c44af3f239879f75e33d7814ac3e7f4948fa2bcf29664cb49ceaccf12c2b\": rpc error: code = NotFound desc = could not find container \"2630c44af3f239879f75e33d7814ac3e7f4948fa2bcf29664cb49ceaccf12c2b\": container with ID starting with 2630c44af3f239879f75e33d7814ac3e7f4948fa2bcf29664cb49ceaccf12c2b not found: ID does not exist" Apr 23 16:58:16.272223 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:16.272146 2580 scope.go:117] "RemoveContainer" containerID="4e4fbf63b5cb23f099fe5b1f52d1d61c7befcbb3187057c4986bd07d8bcb9d2b" Apr 23 16:58:16.272470 ip-10-0-128-198 kubenswrapper[2580]: E0423 16:58:16.272439 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e4fbf63b5cb23f099fe5b1f52d1d61c7befcbb3187057c4986bd07d8bcb9d2b\": container with ID starting with 4e4fbf63b5cb23f099fe5b1f52d1d61c7befcbb3187057c4986bd07d8bcb9d2b not found: ID does not exist" containerID="4e4fbf63b5cb23f099fe5b1f52d1d61c7befcbb3187057c4986bd07d8bcb9d2b" Apr 23 16:58:16.272546 
ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:16.272481 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e4fbf63b5cb23f099fe5b1f52d1d61c7befcbb3187057c4986bd07d8bcb9d2b"} err="failed to get container status \"4e4fbf63b5cb23f099fe5b1f52d1d61c7befcbb3187057c4986bd07d8bcb9d2b\": rpc error: code = NotFound desc = could not find container \"4e4fbf63b5cb23f099fe5b1f52d1d61c7befcbb3187057c4986bd07d8bcb9d2b\": container with ID starting with 4e4fbf63b5cb23f099fe5b1f52d1d61c7befcbb3187057c4986bd07d8bcb9d2b not found: ID does not exist" Apr 23 16:58:16.468264 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:16.468228 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh"] Apr 23 16:58:16.472692 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:16.472663 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-5bbfcf7894-tdvhh"] Apr 23 16:58:16.540074 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:16.540039 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9c70a8d-65a0-4b62-a6ef-6e9c11189e65" path="/var/lib/kubelet/pods/c9c70a8d-65a0-4b62-a6ef-6e9c11189e65/volumes" Apr 23 16:58:20.623674 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:20.623622 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" podUID="68b434c2-61f8-4afa-9a95-3443e031bff5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.63:8001/health\": dial tcp 10.132.0.63:8001: connect: connection refused" Apr 23 16:58:20.664515 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:20.664458 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" podUID="a6a10f65-efca-46fc-aa37-3712b44c85bb" containerName="main" 
probeResult="failure" output="Get \"https://10.132.0.64:8000/health\": dial tcp 10.132.0.64:8000: connect: connection refused" Apr 23 16:58:25.616885 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:25.616837 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c" podUID="9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.62:8000/health\": dial tcp 10.132.0.62:8000: connect: connection refused" Apr 23 16:58:30.622706 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:30.622659 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" podUID="68b434c2-61f8-4afa-9a95-3443e031bff5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.63:8001/health\": dial tcp 10.132.0.63:8001: connect: connection refused" Apr 23 16:58:30.663788 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:30.663735 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" podUID="a6a10f65-efca-46fc-aa37-3712b44c85bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.64:8000/health\": dial tcp 10.132.0.64:8000: connect: connection refused" Apr 23 16:58:35.617086 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:35.617037 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c" podUID="9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.62:8000/health\": dial tcp 10.132.0.62:8000: connect: connection refused" Apr 23 16:58:40.623694 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:40.623638 2580 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" podUID="68b434c2-61f8-4afa-9a95-3443e031bff5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.63:8001/health\": dial tcp 10.132.0.63:8001: connect: connection refused" Apr 23 16:58:40.664595 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:40.664544 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" podUID="a6a10f65-efca-46fc-aa37-3712b44c85bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.64:8000/health\": dial tcp 10.132.0.64:8000: connect: connection refused" Apr 23 16:58:45.617026 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:45.616978 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c" podUID="9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.62:8000/health\": dial tcp 10.132.0.62:8000: connect: connection refused" Apr 23 16:58:50.623286 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:50.623233 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" podUID="68b434c2-61f8-4afa-9a95-3443e031bff5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.63:8001/health\": dial tcp 10.132.0.63:8001: connect: connection refused" Apr 23 16:58:50.664743 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:50.664694 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" podUID="a6a10f65-efca-46fc-aa37-3712b44c85bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.64:8000/health\": dial tcp 10.132.0.64:8000: connect: connection refused" Apr 23 16:58:55.616833 
ip-10-0-128-198 kubenswrapper[2580]: I0423 16:58:55.616786 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c" podUID="9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.62:8000/health\": dial tcp 10.132.0.62:8000: connect: connection refused" Apr 23 16:59:00.623486 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:00.623385 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" podUID="68b434c2-61f8-4afa-9a95-3443e031bff5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.63:8001/health\": dial tcp 10.132.0.63:8001: connect: connection refused" Apr 23 16:59:00.664523 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:00.664482 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" podUID="a6a10f65-efca-46fc-aa37-3712b44c85bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.64:8000/health\": dial tcp 10.132.0.64:8000: connect: connection refused" Apr 23 16:59:05.617218 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:05.617175 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c" podUID="9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.62:8000/health\": dial tcp 10.132.0.62:8000: connect: connection refused" Apr 23 16:59:10.623109 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:10.623060 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" podUID="68b434c2-61f8-4afa-9a95-3443e031bff5" containerName="main" probeResult="failure" output="Get 
\"https://10.132.0.63:8001/health\": dial tcp 10.132.0.63:8001: connect: connection refused" Apr 23 16:59:10.663932 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:10.663887 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" podUID="a6a10f65-efca-46fc-aa37-3712b44c85bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.64:8000/health\": dial tcp 10.132.0.64:8000: connect: connection refused" Apr 23 16:59:15.616833 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:15.616789 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c" podUID="9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.62:8000/health\": dial tcp 10.132.0.62:8000: connect: connection refused" Apr 23 16:59:20.623494 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:20.623434 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" podUID="68b434c2-61f8-4afa-9a95-3443e031bff5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.63:8001/health\": dial tcp 10.132.0.63:8001: connect: connection refused" Apr 23 16:59:20.664261 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:20.664220 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" podUID="a6a10f65-efca-46fc-aa37-3712b44c85bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.64:8000/health\": dial tcp 10.132.0.64:8000: connect: connection refused" Apr 23 16:59:25.617349 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:25.617306 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c" 
podUID="9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.62:8000/health\": dial tcp 10.132.0.62:8000: connect: connection refused" Apr 23 16:59:30.622831 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:30.622784 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" podUID="68b434c2-61f8-4afa-9a95-3443e031bff5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.63:8001/health\": dial tcp 10.132.0.63:8001: connect: connection refused" Apr 23 16:59:30.664026 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:30.663967 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" podUID="a6a10f65-efca-46fc-aa37-3712b44c85bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.64:8000/health\": dial tcp 10.132.0.64:8000: connect: connection refused" Apr 23 16:59:35.626583 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:35.626542 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c" Apr 23 16:59:35.634439 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:35.634412 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c" Apr 23 16:59:40.623424 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:40.623374 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" podUID="68b434c2-61f8-4afa-9a95-3443e031bff5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.63:8001/health\": dial tcp 10.132.0.63:8001: connect: connection refused" Apr 23 16:59:40.663953 ip-10-0-128-198 
kubenswrapper[2580]: I0423 16:59:40.663907 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" podUID="a6a10f65-efca-46fc-aa37-3712b44c85bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.64:8000/health\": dial tcp 10.132.0.64:8000: connect: connection refused" Apr 23 16:59:47.101591 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:47.101553 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c"] Apr 23 16:59:47.102157 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:47.101924 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c" podUID="9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e" containerName="main" containerID="cri-o://39ea38983e00907610cb7bec6b92c86bcc331f2c37b69290a698634e28ea78a4" gracePeriod=30 Apr 23 16:59:50.622700 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:50.622658 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" podUID="68b434c2-61f8-4afa-9a95-3443e031bff5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.63:8001/health\": dial tcp 10.132.0.63:8001: connect: connection refused" Apr 23 16:59:50.663755 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:50.663709 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" podUID="a6a10f65-efca-46fc-aa37-3712b44c85bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.64:8000/health\": dial tcp 10.132.0.64:8000: connect: connection refused" Apr 23 16:59:58.163286 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:58.163251 2580 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 23 16:59:58.163722 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:58.163705 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9c70a8d-65a0-4b62-a6ef-6e9c11189e65" containerName="storage-initializer" Apr 23 16:59:58.163771 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:58.163725 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c70a8d-65a0-4b62-a6ef-6e9c11189e65" containerName="storage-initializer" Apr 23 16:59:58.163771 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:58.163745 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9c70a8d-65a0-4b62-a6ef-6e9c11189e65" containerName="main" Apr 23 16:59:58.163771 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:58.163750 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c70a8d-65a0-4b62-a6ef-6e9c11189e65" containerName="main" Apr 23 16:59:58.163871 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:58.163821 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="c9c70a8d-65a0-4b62-a6ef-6e9c11189e65" containerName="main" Apr 23 16:59:58.168040 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:58.168022 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 23 16:59:58.170914 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:58.170887 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-fr79q\"" Apr 23 16:59:58.172047 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:58.172030 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 23 16:59:58.177433 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:58.177407 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 23 16:59:58.260346 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:58.260308 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c831cb87-8c1e-44b9-946a-d7c02d40b92e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c831cb87-8c1e-44b9-946a-d7c02d40b92e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 23 16:59:58.260346 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:58.260347 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c831cb87-8c1e-44b9-946a-d7c02d40b92e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c831cb87-8c1e-44b9-946a-d7c02d40b92e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 23 16:59:58.260630 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:58.260478 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" 
(UniqueName: \"kubernetes.io/empty-dir/c831cb87-8c1e-44b9-946a-d7c02d40b92e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c831cb87-8c1e-44b9-946a-d7c02d40b92e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 23 16:59:58.260630 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:58.260543 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t986\" (UniqueName: \"kubernetes.io/projected/c831cb87-8c1e-44b9-946a-d7c02d40b92e-kube-api-access-7t986\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c831cb87-8c1e-44b9-946a-d7c02d40b92e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 23 16:59:58.260630 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:58.260584 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c831cb87-8c1e-44b9-946a-d7c02d40b92e-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c831cb87-8c1e-44b9-946a-d7c02d40b92e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 23 16:59:58.260630 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:58.260614 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c831cb87-8c1e-44b9-946a-d7c02d40b92e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c831cb87-8c1e-44b9-946a-d7c02d40b92e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 23 16:59:58.361849 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:58.361803 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c831cb87-8c1e-44b9-946a-d7c02d40b92e-dshm\") pod 
\"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c831cb87-8c1e-44b9-946a-d7c02d40b92e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 23 16:59:58.362026 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:58.361864 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7t986\" (UniqueName: \"kubernetes.io/projected/c831cb87-8c1e-44b9-946a-d7c02d40b92e-kube-api-access-7t986\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c831cb87-8c1e-44b9-946a-d7c02d40b92e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 23 16:59:58.362026 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:58.361901 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c831cb87-8c1e-44b9-946a-d7c02d40b92e-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c831cb87-8c1e-44b9-946a-d7c02d40b92e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 23 16:59:58.362026 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:58.361919 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c831cb87-8c1e-44b9-946a-d7c02d40b92e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c831cb87-8c1e-44b9-946a-d7c02d40b92e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 23 16:59:58.362026 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:58.361958 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c831cb87-8c1e-44b9-946a-d7c02d40b92e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c831cb87-8c1e-44b9-946a-d7c02d40b92e\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 23 16:59:58.362026 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:58.361982 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c831cb87-8c1e-44b9-946a-d7c02d40b92e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c831cb87-8c1e-44b9-946a-d7c02d40b92e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 23 16:59:58.362351 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:58.362330 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c831cb87-8c1e-44b9-946a-d7c02d40b92e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c831cb87-8c1e-44b9-946a-d7c02d40b92e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 23 16:59:58.362470 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:58.362388 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c831cb87-8c1e-44b9-946a-d7c02d40b92e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c831cb87-8c1e-44b9-946a-d7c02d40b92e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 23 16:59:58.362470 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:58.362417 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c831cb87-8c1e-44b9-946a-d7c02d40b92e-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c831cb87-8c1e-44b9-946a-d7c02d40b92e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 23 16:59:58.364488 ip-10-0-128-198 kubenswrapper[2580]: I0423 
16:59:58.364457 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c831cb87-8c1e-44b9-946a-d7c02d40b92e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c831cb87-8c1e-44b9-946a-d7c02d40b92e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 23 16:59:58.364611 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:58.364523 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c831cb87-8c1e-44b9-946a-d7c02d40b92e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c831cb87-8c1e-44b9-946a-d7c02d40b92e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 23 16:59:58.369618 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:58.369594 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t986\" (UniqueName: \"kubernetes.io/projected/c831cb87-8c1e-44b9-946a-d7c02d40b92e-kube-api-access-7t986\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c831cb87-8c1e-44b9-946a-d7c02d40b92e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 23 16:59:58.480516 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:58.480474 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 23 16:59:58.618743 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:58.618713 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 23 16:59:58.620681 ip-10-0-128-198 kubenswrapper[2580]: W0423 16:59:58.620654 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc831cb87_8c1e_44b9_946a_d7c02d40b92e.slice/crio-ff9a4c6112ba1c6b613ad2605c897dd67eeeae7c0e6879386ea813279f8fa4b1 WatchSource:0}: Error finding container ff9a4c6112ba1c6b613ad2605c897dd67eeeae7c0e6879386ea813279f8fa4b1: Status 404 returned error can't find the container with id ff9a4c6112ba1c6b613ad2605c897dd67eeeae7c0e6879386ea813279f8fa4b1 Apr 23 16:59:59.600503 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:59.600462 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"c831cb87-8c1e-44b9-946a-d7c02d40b92e","Type":"ContainerStarted","Data":"0d39b759f07b8919d07bb31a95f96645abd07c948624dd3598486ab0123e1785"} Apr 23 16:59:59.600503 ip-10-0-128-198 kubenswrapper[2580]: I0423 16:59:59.600504 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"c831cb87-8c1e-44b9-946a-d7c02d40b92e","Type":"ContainerStarted","Data":"ff9a4c6112ba1c6b613ad2605c897dd67eeeae7c0e6879386ea813279f8fa4b1"} Apr 23 17:00:00.623075 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:00.623030 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" podUID="68b434c2-61f8-4afa-9a95-3443e031bff5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.63:8001/health\": dial tcp 
10.132.0.63:8001: connect: connection refused" Apr 23 17:00:00.664633 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:00.664584 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" podUID="a6a10f65-efca-46fc-aa37-3712b44c85bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.64:8000/health\": dial tcp 10.132.0.64:8000: connect: connection refused" Apr 23 17:00:03.616144 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:03.616107 2580 generic.go:358] "Generic (PLEG): container finished" podID="c831cb87-8c1e-44b9-946a-d7c02d40b92e" containerID="0d39b759f07b8919d07bb31a95f96645abd07c948624dd3598486ab0123e1785" exitCode=0 Apr 23 17:00:03.616637 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:03.616179 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"c831cb87-8c1e-44b9-946a-d7c02d40b92e","Type":"ContainerDied","Data":"0d39b759f07b8919d07bb31a95f96645abd07c948624dd3598486ab0123e1785"} Apr 23 17:00:04.622180 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:04.622143 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"c831cb87-8c1e-44b9-946a-d7c02d40b92e","Type":"ContainerStarted","Data":"1ed0aec055f133b80b6b4512e8bc019e05e98b409b555e35912c32eaa138684d"} Apr 23 17:00:04.642178 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:04.642118 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podStartSLOduration=6.642098927 podStartE2EDuration="6.642098927s" podCreationTimestamp="2026-04-23 16:59:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:00:04.641505068 +0000 UTC 
m=+1490.702579332" watchObservedRunningTime="2026-04-23 17:00:04.642098927 +0000 UTC m=+1490.703173184" Apr 23 17:00:08.480640 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:08.480595 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 23 17:00:08.482567 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:08.482531 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c831cb87-8c1e-44b9-946a-d7c02d40b92e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.65:8000/health\": dial tcp 10.132.0.65:8000: connect: connection refused" Apr 23 17:00:10.623204 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:10.623143 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" podUID="68b434c2-61f8-4afa-9a95-3443e031bff5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.63:8001/health\": dial tcp 10.132.0.63:8001: connect: connection refused" Apr 23 17:00:10.664120 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:10.664069 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" podUID="a6a10f65-efca-46fc-aa37-3712b44c85bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.64:8000/health\": dial tcp 10.132.0.64:8000: connect: connection refused" Apr 23 17:00:14.606924 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:14.606891 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfkqz_5949893b-cd3d-46d5-b194-4ef1ad542b81/ovn-acl-logging/0.log" Apr 23 17:00:14.609758 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:14.609730 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfkqz_5949893b-cd3d-46d5-b194-4ef1ad542b81/ovn-acl-logging/0.log" Apr 23 17:00:17.540949 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.540917 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c_9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e/main/0.log" Apr 23 17:00:17.541417 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.541397 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c" Apr 23 17:00:17.651767 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.651657 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-tls-certs\") pod \"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e\" (UID: \"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e\") " Apr 23 17:00:17.651767 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.651730 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-kserve-provision-location\") pod \"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e\" (UID: \"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e\") " Apr 23 17:00:17.651767 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.651763 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhktr\" (UniqueName: \"kubernetes.io/projected/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-kube-api-access-rhktr\") pod \"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e\" (UID: \"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e\") " Apr 23 17:00:17.652076 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.651789 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-home\") pod \"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e\" (UID: \"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e\") " Apr 23 17:00:17.652076 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.651821 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-dshm\") pod \"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e\" (UID: \"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e\") " Apr 23 17:00:17.652242 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.652203 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-home" (OuterVolumeSpecName: "home") pod "9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e" (UID: "9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:00:17.652409 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.652322 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-model-cache\") pod \"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e\" (UID: \"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e\") " Apr 23 17:00:17.652533 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.652509 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-model-cache" (OuterVolumeSpecName: "model-cache") pod "9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e" (UID: "9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:00:17.652745 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.652722 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-home\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:00:17.652848 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.652748 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-model-cache\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:00:17.654547 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.654474 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-kube-api-access-rhktr" (OuterVolumeSpecName: "kube-api-access-rhktr") pod "9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e" (UID: "9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e"). InnerVolumeSpecName "kube-api-access-rhktr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:00:17.654663 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.654618 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e" (UID: "9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:00:17.655149 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.655123 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-dshm" (OuterVolumeSpecName: "dshm") pod "9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e" (UID: "9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:00:17.669537 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.669502 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e" (UID: "9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:00:17.677471 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.677447 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c_9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e/main/0.log" Apr 23 17:00:17.677869 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.677831 2580 generic.go:358] "Generic (PLEG): container finished" podID="9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e" containerID="39ea38983e00907610cb7bec6b92c86bcc331f2c37b69290a698634e28ea78a4" exitCode=137 Apr 23 17:00:17.677960 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.677909 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c" Apr 23 17:00:17.678019 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.677909 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c" event={"ID":"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e","Type":"ContainerDied","Data":"39ea38983e00907610cb7bec6b92c86bcc331f2c37b69290a698634e28ea78a4"} Apr 23 17:00:17.678019 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.678014 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c" event={"ID":"9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e","Type":"ContainerDied","Data":"b0af2b4295ff04f5338d8d07cc48d13da76a9e64d36fa8eda770e9a678a6046a"} Apr 23 17:00:17.678128 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.678030 2580 scope.go:117] "RemoveContainer" containerID="39ea38983e00907610cb7bec6b92c86bcc331f2c37b69290a698634e28ea78a4" Apr 23 17:00:17.700717 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.700688 2580 scope.go:117] "RemoveContainer" containerID="0613b8e58a12a5e26016ad1764a2d90294fb0d9021a66773b79027875bca4dd0" Apr 23 17:00:17.706017 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.705989 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c"] Apr 23 17:00:17.709115 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.709090 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6db9476544b428c"] Apr 23 17:00:17.732770 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.732745 2580 scope.go:117] "RemoveContainer" containerID="39ea38983e00907610cb7bec6b92c86bcc331f2c37b69290a698634e28ea78a4" Apr 23 17:00:17.733186 ip-10-0-128-198 kubenswrapper[2580]: E0423 17:00:17.733161 2580 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39ea38983e00907610cb7bec6b92c86bcc331f2c37b69290a698634e28ea78a4\": container with ID starting with 39ea38983e00907610cb7bec6b92c86bcc331f2c37b69290a698634e28ea78a4 not found: ID does not exist" containerID="39ea38983e00907610cb7bec6b92c86bcc331f2c37b69290a698634e28ea78a4" Apr 23 17:00:17.733324 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.733192 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39ea38983e00907610cb7bec6b92c86bcc331f2c37b69290a698634e28ea78a4"} err="failed to get container status \"39ea38983e00907610cb7bec6b92c86bcc331f2c37b69290a698634e28ea78a4\": rpc error: code = NotFound desc = could not find container \"39ea38983e00907610cb7bec6b92c86bcc331f2c37b69290a698634e28ea78a4\": container with ID starting with 39ea38983e00907610cb7bec6b92c86bcc331f2c37b69290a698634e28ea78a4 not found: ID does not exist" Apr 23 17:00:17.733324 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.733211 2580 scope.go:117] "RemoveContainer" containerID="0613b8e58a12a5e26016ad1764a2d90294fb0d9021a66773b79027875bca4dd0" Apr 23 17:00:17.733534 ip-10-0-128-198 kubenswrapper[2580]: E0423 17:00:17.733511 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0613b8e58a12a5e26016ad1764a2d90294fb0d9021a66773b79027875bca4dd0\": container with ID starting with 0613b8e58a12a5e26016ad1764a2d90294fb0d9021a66773b79027875bca4dd0 not found: ID does not exist" containerID="0613b8e58a12a5e26016ad1764a2d90294fb0d9021a66773b79027875bca4dd0" Apr 23 17:00:17.733605 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.733538 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0613b8e58a12a5e26016ad1764a2d90294fb0d9021a66773b79027875bca4dd0"} err="failed to get container status 
\"0613b8e58a12a5e26016ad1764a2d90294fb0d9021a66773b79027875bca4dd0\": rpc error: code = NotFound desc = could not find container \"0613b8e58a12a5e26016ad1764a2d90294fb0d9021a66773b79027875bca4dd0\": container with ID starting with 0613b8e58a12a5e26016ad1764a2d90294fb0d9021a66773b79027875bca4dd0 not found: ID does not exist" Apr 23 17:00:17.754027 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.753989 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-tls-certs\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:00:17.754027 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.754022 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-kserve-provision-location\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:00:17.754027 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.754032 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rhktr\" (UniqueName: \"kubernetes.io/projected/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-kube-api-access-rhktr\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:00:17.754316 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:17.754043 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e-dshm\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:00:18.481274 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:18.481224 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c831cb87-8c1e-44b9-946a-d7c02d40b92e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.65:8000/health\": dial tcp 10.132.0.65:8000: connect: connection refused" Apr 23 
17:00:18.558259 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:18.538489 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e" path="/var/lib/kubelet/pods/9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e/volumes" Apr 23 17:00:20.623735 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:20.623687 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" podUID="68b434c2-61f8-4afa-9a95-3443e031bff5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.63:8001/health\": dial tcp 10.132.0.63:8001: connect: connection refused" Apr 23 17:00:20.664387 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:20.664337 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" podUID="a6a10f65-efca-46fc-aa37-3712b44c85bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.64:8000/health\": dial tcp 10.132.0.64:8000: connect: connection refused" Apr 23 17:00:28.480982 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:28.480877 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 23 17:00:28.481672 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:28.481633 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c831cb87-8c1e-44b9-946a-d7c02d40b92e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.65:8000/health\": dial tcp 10.132.0.65:8000: connect: connection refused" Apr 23 17:00:30.622849 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:30.622801 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" 
podUID="68b434c2-61f8-4afa-9a95-3443e031bff5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.63:8001/health\": dial tcp 10.132.0.63:8001: connect: connection refused" Apr 23 17:00:30.674831 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:30.674797 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" Apr 23 17:00:30.682984 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:30.682956 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" Apr 23 17:00:38.481027 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:38.480965 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c831cb87-8c1e-44b9-946a-d7c02d40b92e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.65:8000/health\": dial tcp 10.132.0.65:8000: connect: connection refused" Apr 23 17:00:40.632314 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:40.632264 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" Apr 23 17:00:40.649274 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:40.649236 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" Apr 23 17:00:48.481735 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:48.481676 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c831cb87-8c1e-44b9-946a-d7c02d40b92e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.65:8000/health\": dial tcp 10.132.0.65:8000: connect: connection refused" Apr 23 
17:00:52.959869 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:52.959828 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx"] Apr 23 17:00:52.960403 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:52.960283 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" podUID="a6a10f65-efca-46fc-aa37-3712b44c85bb" containerName="main" containerID="cri-o://54c3b71258b1624787128406e013fd540b9a4be990b4a55115d3fc6afda1f69a" gracePeriod=30 Apr 23 17:00:52.964948 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:52.964920 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft"] Apr 23 17:00:52.965438 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:52.965386 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" podUID="68b434c2-61f8-4afa-9a95-3443e031bff5" containerName="main" containerID="cri-o://6c2d2fb1e7e999a0d56dc8a831e19ec7f8f651379512ca7cd1f0cfa7449e9de3" gracePeriod=30 Apr 23 17:00:58.481709 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:00:58.481651 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c831cb87-8c1e-44b9-946a-d7c02d40b92e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.65:8000/health\": dial tcp 10.132.0.65:8000: connect: connection refused" Apr 23 17:01:00.445853 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:00.445812 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g"] Apr 23 17:01:00.446246 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:00.446233 2580 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e" containerName="storage-initializer" Apr 23 17:01:00.446318 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:00.446248 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e" containerName="storage-initializer" Apr 23 17:01:00.446318 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:00.446256 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e" containerName="main" Apr 23 17:01:00.446318 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:00.446261 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e" containerName="main" Apr 23 17:01:00.446427 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:00.446347 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="9dcbcbff-ab49-4b7d-86d4-6eeecc35fd8e" containerName="main" Apr 23 17:01:00.451376 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:00.451354 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" Apr 23 17:01:00.454675 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:00.454652 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 23 17:01:00.462171 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:00.462147 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g"] Apr 23 17:01:00.531523 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:00.531485 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9mxn\" (UniqueName: \"kubernetes.io/projected/06e48821-aabf-4e37-9252-48411a197de1-kube-api-access-p9mxn\") pod \"custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g\" (UID: \"06e48821-aabf-4e37-9252-48411a197de1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" Apr 23 17:01:00.531725 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:00.531541 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/06e48821-aabf-4e37-9252-48411a197de1-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g\" (UID: \"06e48821-aabf-4e37-9252-48411a197de1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" Apr 23 17:01:00.531725 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:00.531669 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/06e48821-aabf-4e37-9252-48411a197de1-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g\" (UID: 
\"06e48821-aabf-4e37-9252-48411a197de1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" Apr 23 17:01:00.531725 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:00.531718 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/06e48821-aabf-4e37-9252-48411a197de1-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g\" (UID: \"06e48821-aabf-4e37-9252-48411a197de1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" Apr 23 17:01:00.531876 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:00.531769 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/06e48821-aabf-4e37-9252-48411a197de1-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g\" (UID: \"06e48821-aabf-4e37-9252-48411a197de1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" Apr 23 17:01:00.531876 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:00.531791 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/06e48821-aabf-4e37-9252-48411a197de1-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g\" (UID: \"06e48821-aabf-4e37-9252-48411a197de1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" Apr 23 17:01:00.632656 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:00.632604 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/06e48821-aabf-4e37-9252-48411a197de1-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g\" (UID: \"06e48821-aabf-4e37-9252-48411a197de1\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" Apr 23 17:01:00.632656 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:00.632660 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/06e48821-aabf-4e37-9252-48411a197de1-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g\" (UID: \"06e48821-aabf-4e37-9252-48411a197de1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" Apr 23 17:01:00.632938 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:00.632684 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/06e48821-aabf-4e37-9252-48411a197de1-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g\" (UID: \"06e48821-aabf-4e37-9252-48411a197de1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" Apr 23 17:01:00.632938 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:00.632706 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/06e48821-aabf-4e37-9252-48411a197de1-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g\" (UID: \"06e48821-aabf-4e37-9252-48411a197de1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" Apr 23 17:01:00.632938 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:00.632752 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p9mxn\" (UniqueName: \"kubernetes.io/projected/06e48821-aabf-4e37-9252-48411a197de1-kube-api-access-p9mxn\") pod \"custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g\" (UID: \"06e48821-aabf-4e37-9252-48411a197de1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" Apr 23 
17:01:00.632938 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:00.632808 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/06e48821-aabf-4e37-9252-48411a197de1-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g\" (UID: \"06e48821-aabf-4e37-9252-48411a197de1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" Apr 23 17:01:00.633154 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:00.633094 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/06e48821-aabf-4e37-9252-48411a197de1-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g\" (UID: \"06e48821-aabf-4e37-9252-48411a197de1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" Apr 23 17:01:00.633349 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:00.633280 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/06e48821-aabf-4e37-9252-48411a197de1-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g\" (UID: \"06e48821-aabf-4e37-9252-48411a197de1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" Apr 23 17:01:00.633492 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:00.633368 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/06e48821-aabf-4e37-9252-48411a197de1-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g\" (UID: \"06e48821-aabf-4e37-9252-48411a197de1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" Apr 23 17:01:00.635200 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:00.635177 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/06e48821-aabf-4e37-9252-48411a197de1-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g\" (UID: \"06e48821-aabf-4e37-9252-48411a197de1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" Apr 23 17:01:00.635578 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:00.635561 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/06e48821-aabf-4e37-9252-48411a197de1-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g\" (UID: \"06e48821-aabf-4e37-9252-48411a197de1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" Apr 23 17:01:00.640805 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:00.640785 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9mxn\" (UniqueName: \"kubernetes.io/projected/06e48821-aabf-4e37-9252-48411a197de1-kube-api-access-p9mxn\") pod \"custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g\" (UID: \"06e48821-aabf-4e37-9252-48411a197de1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" Apr 23 17:01:00.765030 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:00.764998 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g"
Apr 23 17:01:00.930746 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:00.930717 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g"]
Apr 23 17:01:00.932813 ip-10-0-128-198 kubenswrapper[2580]: W0423 17:01:00.932770 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06e48821_aabf_4e37_9252_48411a197de1.slice/crio-af12d036475833bad9035e19d46388b0c5df681ebdb9e123990d79475e2051f2 WatchSource:0}: Error finding container af12d036475833bad9035e19d46388b0c5df681ebdb9e123990d79475e2051f2: Status 404 returned error can't find the container with id af12d036475833bad9035e19d46388b0c5df681ebdb9e123990d79475e2051f2
Apr 23 17:01:01.884844 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:01.884809 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" event={"ID":"06e48821-aabf-4e37-9252-48411a197de1","Type":"ContainerStarted","Data":"cb462b5029554ba2acb93066c302d79eff20f942d0c7e4c2120a59568ac70830"}
Apr 23 17:01:01.884844 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:01.884848 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" event={"ID":"06e48821-aabf-4e37-9252-48411a197de1","Type":"ContainerStarted","Data":"af12d036475833bad9035e19d46388b0c5df681ebdb9e123990d79475e2051f2"}
Apr 23 17:01:05.902812 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:05.902778 2580 generic.go:358] "Generic (PLEG): container finished" podID="06e48821-aabf-4e37-9252-48411a197de1" containerID="cb462b5029554ba2acb93066c302d79eff20f942d0c7e4c2120a59568ac70830" exitCode=0
Apr 23 17:01:05.903171 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:05.902855 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" event={"ID":"06e48821-aabf-4e37-9252-48411a197de1","Type":"ContainerDied","Data":"cb462b5029554ba2acb93066c302d79eff20f942d0c7e4c2120a59568ac70830"}
Apr 23 17:01:06.908942 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:06.908904 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" event={"ID":"06e48821-aabf-4e37-9252-48411a197de1","Type":"ContainerStarted","Data":"0ce3062f3ad6d67cea52120789e6bb16010993eff41dad94465e3d85246b4d20"}
Apr 23 17:01:06.930859 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:06.930808 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" podStartSLOduration=6.930794573 podStartE2EDuration="6.930794573s" podCreationTimestamp="2026-04-23 17:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:01:06.928028552 +0000 UTC m=+1552.989102806" watchObservedRunningTime="2026-04-23 17:01:06.930794573 +0000 UTC m=+1552.991868804"
Apr 23 17:01:08.481000 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:08.480955 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c831cb87-8c1e-44b9-946a-d7c02d40b92e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.65:8000/health\": dial tcp 10.132.0.65:8000: connect: connection refused"
Apr 23 17:01:10.765601 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:10.765566 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g"
Apr 23 17:01:10.765601 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:10.765611 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g"
Apr 23 17:01:10.767576 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:10.767541 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" podUID="06e48821-aabf-4e37-9252-48411a197de1" containerName="main" probeResult="failure" output="Get \"https://10.132.0.66:8000/health\": dial tcp 10.132.0.66:8000: connect: connection refused"
Apr 23 17:01:18.481184 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:18.481130 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c831cb87-8c1e-44b9-946a-d7c02d40b92e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.65:8000/health\": dial tcp 10.132.0.65:8000: connect: connection refused"
Apr 23 17:01:20.766235 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:20.766196 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" podUID="06e48821-aabf-4e37-9252-48411a197de1" containerName="main" probeResult="failure" output="Get \"https://10.132.0.66:8000/health\": dial tcp 10.132.0.66:8000: connect: connection refused"
Apr 23 17:01:22.966458 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:22.966380 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" podUID="68b434c2-61f8-4afa-9a95-3443e031bff5" containerName="llm-d-routing-sidecar" containerID="cri-o://f0f34d6c20c11ab004e3e91716978ce530ccb0b09b6237345ae0d14941df26bc" gracePeriod=2
Apr 23 17:01:23.453271 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.453244 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx"
Apr 23 17:01:23.456961 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.456934 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft_68b434c2-61f8-4afa-9a95-3443e031bff5/main/0.log"
Apr 23 17:01:23.457670 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.457652 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft"
Apr 23 17:01:23.544916 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.544827 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/68b434c2-61f8-4afa-9a95-3443e031bff5-kserve-provision-location\") pod \"68b434c2-61f8-4afa-9a95-3443e031bff5\" (UID: \"68b434c2-61f8-4afa-9a95-3443e031bff5\") "
Apr 23 17:01:23.544916 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.544901 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nv7fp\" (UniqueName: \"kubernetes.io/projected/a6a10f65-efca-46fc-aa37-3712b44c85bb-kube-api-access-nv7fp\") pod \"a6a10f65-efca-46fc-aa37-3712b44c85bb\" (UID: \"a6a10f65-efca-46fc-aa37-3712b44c85bb\") "
Apr 23 17:01:23.545146 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.544951 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/68b434c2-61f8-4afa-9a95-3443e031bff5-dshm\") pod \"68b434c2-61f8-4afa-9a95-3443e031bff5\" (UID: \"68b434c2-61f8-4afa-9a95-3443e031bff5\") "
Apr 23 17:01:23.545146 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.544984 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/68b434c2-61f8-4afa-9a95-3443e031bff5-tls-certs\") pod \"68b434c2-61f8-4afa-9a95-3443e031bff5\" (UID: \"68b434c2-61f8-4afa-9a95-3443e031bff5\") "
Apr 23 17:01:23.545146 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.545016 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nb6xg\" (UniqueName: \"kubernetes.io/projected/68b434c2-61f8-4afa-9a95-3443e031bff5-kube-api-access-nb6xg\") pod \"68b434c2-61f8-4afa-9a95-3443e031bff5\" (UID: \"68b434c2-61f8-4afa-9a95-3443e031bff5\") "
Apr 23 17:01:23.545146 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.545051 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a6a10f65-efca-46fc-aa37-3712b44c85bb-home\") pod \"a6a10f65-efca-46fc-aa37-3712b44c85bb\" (UID: \"a6a10f65-efca-46fc-aa37-3712b44c85bb\") "
Apr 23 17:01:23.545146 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.545078 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a6a10f65-efca-46fc-aa37-3712b44c85bb-tls-certs\") pod \"a6a10f65-efca-46fc-aa37-3712b44c85bb\" (UID: \"a6a10f65-efca-46fc-aa37-3712b44c85bb\") "
Apr 23 17:01:23.545146 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.545109 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a6a10f65-efca-46fc-aa37-3712b44c85bb-kserve-provision-location\") pod \"a6a10f65-efca-46fc-aa37-3712b44c85bb\" (UID: \"a6a10f65-efca-46fc-aa37-3712b44c85bb\") "
Apr 23 17:01:23.545512 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.545166 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/68b434c2-61f8-4afa-9a95-3443e031bff5-model-cache\") pod \"68b434c2-61f8-4afa-9a95-3443e031bff5\" (UID: \"68b434c2-61f8-4afa-9a95-3443e031bff5\") "
Apr 23 17:01:23.545512 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.545194 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a6a10f65-efca-46fc-aa37-3712b44c85bb-dshm\") pod \"a6a10f65-efca-46fc-aa37-3712b44c85bb\" (UID: \"a6a10f65-efca-46fc-aa37-3712b44c85bb\") "
Apr 23 17:01:23.545512 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.545227 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a6a10f65-efca-46fc-aa37-3712b44c85bb-model-cache\") pod \"a6a10f65-efca-46fc-aa37-3712b44c85bb\" (UID: \"a6a10f65-efca-46fc-aa37-3712b44c85bb\") "
Apr 23 17:01:23.545512 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.545284 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/68b434c2-61f8-4afa-9a95-3443e031bff5-home\") pod \"68b434c2-61f8-4afa-9a95-3443e031bff5\" (UID: \"68b434c2-61f8-4afa-9a95-3443e031bff5\") "
Apr 23 17:01:23.546675 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.546437 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6a10f65-efca-46fc-aa37-3712b44c85bb-model-cache" (OuterVolumeSpecName: "model-cache") pod "a6a10f65-efca-46fc-aa37-3712b44c85bb" (UID: "a6a10f65-efca-46fc-aa37-3712b44c85bb"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 17:01:23.546675 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.546510 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6a10f65-efca-46fc-aa37-3712b44c85bb-home" (OuterVolumeSpecName: "home") pod "a6a10f65-efca-46fc-aa37-3712b44c85bb" (UID: "a6a10f65-efca-46fc-aa37-3712b44c85bb"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 17:01:23.547323 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.547074 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68b434c2-61f8-4afa-9a95-3443e031bff5-model-cache" (OuterVolumeSpecName: "model-cache") pod "68b434c2-61f8-4afa-9a95-3443e031bff5" (UID: "68b434c2-61f8-4afa-9a95-3443e031bff5"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 17:01:23.547323 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.547272 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68b434c2-61f8-4afa-9a95-3443e031bff5-home" (OuterVolumeSpecName: "home") pod "68b434c2-61f8-4afa-9a95-3443e031bff5" (UID: "68b434c2-61f8-4afa-9a95-3443e031bff5"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 17:01:23.550063 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.549928 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6a10f65-efca-46fc-aa37-3712b44c85bb-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a6a10f65-efca-46fc-aa37-3712b44c85bb" (UID: "a6a10f65-efca-46fc-aa37-3712b44c85bb"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 17:01:23.550063 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.549946 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68b434c2-61f8-4afa-9a95-3443e031bff5-kube-api-access-nb6xg" (OuterVolumeSpecName: "kube-api-access-nb6xg") pod "68b434c2-61f8-4afa-9a95-3443e031bff5" (UID: "68b434c2-61f8-4afa-9a95-3443e031bff5"). InnerVolumeSpecName "kube-api-access-nb6xg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 17:01:23.550063 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.549991 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6a10f65-efca-46fc-aa37-3712b44c85bb-kube-api-access-nv7fp" (OuterVolumeSpecName: "kube-api-access-nv7fp") pod "a6a10f65-efca-46fc-aa37-3712b44c85bb" (UID: "a6a10f65-efca-46fc-aa37-3712b44c85bb"). InnerVolumeSpecName "kube-api-access-nv7fp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 17:01:23.550279 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.550093 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68b434c2-61f8-4afa-9a95-3443e031bff5-dshm" (OuterVolumeSpecName: "dshm") pod "68b434c2-61f8-4afa-9a95-3443e031bff5" (UID: "68b434c2-61f8-4afa-9a95-3443e031bff5"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 17:01:23.550454 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.550416 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6a10f65-efca-46fc-aa37-3712b44c85bb-dshm" (OuterVolumeSpecName: "dshm") pod "a6a10f65-efca-46fc-aa37-3712b44c85bb" (UID: "a6a10f65-efca-46fc-aa37-3712b44c85bb"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 17:01:23.551586 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.551559 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68b434c2-61f8-4afa-9a95-3443e031bff5-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "68b434c2-61f8-4afa-9a95-3443e031bff5" (UID: "68b434c2-61f8-4afa-9a95-3443e031bff5"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 17:01:23.572374 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.572264 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68b434c2-61f8-4afa-9a95-3443e031bff5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "68b434c2-61f8-4afa-9a95-3443e031bff5" (UID: "68b434c2-61f8-4afa-9a95-3443e031bff5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 17:01:23.580221 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.580181 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6a10f65-efca-46fc-aa37-3712b44c85bb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a6a10f65-efca-46fc-aa37-3712b44c85bb" (UID: "a6a10f65-efca-46fc-aa37-3712b44c85bb"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 17:01:23.650385 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.650338 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/68b434c2-61f8-4afa-9a95-3443e031bff5-model-cache\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 17:01:23.650385 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.650384 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a6a10f65-efca-46fc-aa37-3712b44c85bb-dshm\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 17:01:23.650633 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.650399 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a6a10f65-efca-46fc-aa37-3712b44c85bb-model-cache\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 17:01:23.650633 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.650415 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/68b434c2-61f8-4afa-9a95-3443e031bff5-home\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 17:01:23.650633 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.650429 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/68b434c2-61f8-4afa-9a95-3443e031bff5-kserve-provision-location\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 17:01:23.650633 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.650444 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nv7fp\" (UniqueName: \"kubernetes.io/projected/a6a10f65-efca-46fc-aa37-3712b44c85bb-kube-api-access-nv7fp\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 17:01:23.650633 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.650458 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/68b434c2-61f8-4afa-9a95-3443e031bff5-dshm\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 17:01:23.650633 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.650472 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/68b434c2-61f8-4afa-9a95-3443e031bff5-tls-certs\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 17:01:23.650633 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.650488 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nb6xg\" (UniqueName: \"kubernetes.io/projected/68b434c2-61f8-4afa-9a95-3443e031bff5-kube-api-access-nb6xg\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 17:01:23.650633 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.650501 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a6a10f65-efca-46fc-aa37-3712b44c85bb-home\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 17:01:23.650633 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.650514 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a6a10f65-efca-46fc-aa37-3712b44c85bb-tls-certs\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 17:01:23.650633 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.650527 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a6a10f65-efca-46fc-aa37-3712b44c85bb-kserve-provision-location\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 17:01:23.984729 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.984700 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft_68b434c2-61f8-4afa-9a95-3443e031bff5/main/0.log"
Apr 23 17:01:23.985481 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.985456 2580 generic.go:358] "Generic (PLEG): container finished" podID="68b434c2-61f8-4afa-9a95-3443e031bff5" containerID="6c2d2fb1e7e999a0d56dc8a831e19ec7f8f651379512ca7cd1f0cfa7449e9de3" exitCode=137
Apr 23 17:01:23.985481 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.985481 2580 generic.go:358] "Generic (PLEG): container finished" podID="68b434c2-61f8-4afa-9a95-3443e031bff5" containerID="f0f34d6c20c11ab004e3e91716978ce530ccb0b09b6237345ae0d14941df26bc" exitCode=0
Apr 23 17:01:23.985652 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.985542 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" event={"ID":"68b434c2-61f8-4afa-9a95-3443e031bff5","Type":"ContainerDied","Data":"6c2d2fb1e7e999a0d56dc8a831e19ec7f8f651379512ca7cd1f0cfa7449e9de3"}
Apr 23 17:01:23.985652 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.985570 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft"
Apr 23 17:01:23.985652 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.985595 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" event={"ID":"68b434c2-61f8-4afa-9a95-3443e031bff5","Type":"ContainerDied","Data":"f0f34d6c20c11ab004e3e91716978ce530ccb0b09b6237345ae0d14941df26bc"}
Apr 23 17:01:23.985652 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.985609 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft" event={"ID":"68b434c2-61f8-4afa-9a95-3443e031bff5","Type":"ContainerDied","Data":"dc635217d33aa074029c63900d61c52c72f41ccae7145d8abdbed7cf0bba89c1"}
Apr 23 17:01:23.985652 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.985624 2580 scope.go:117] "RemoveContainer" containerID="6c2d2fb1e7e999a0d56dc8a831e19ec7f8f651379512ca7cd1f0cfa7449e9de3"
Apr 23 17:01:23.988003 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.987975 2580 generic.go:358] "Generic (PLEG): container finished" podID="a6a10f65-efca-46fc-aa37-3712b44c85bb" containerID="54c3b71258b1624787128406e013fd540b9a4be990b4a55115d3fc6afda1f69a" exitCode=137
Apr 23 17:01:23.988104 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.988017 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" event={"ID":"a6a10f65-efca-46fc-aa37-3712b44c85bb","Type":"ContainerDied","Data":"54c3b71258b1624787128406e013fd540b9a4be990b4a55115d3fc6afda1f69a"}
Apr 23 17:01:23.988104 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.988060 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx" event={"ID":"a6a10f65-efca-46fc-aa37-3712b44c85bb","Type":"ContainerDied","Data":"d1f1112eac8b009e0c99f6b16a77c3e19c1e29d546ac4713decf0435275d3e38"}
Apr 23 17:01:23.988225 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:23.988164 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx"
Apr 23 17:01:24.014354 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:24.014328 2580 scope.go:117] "RemoveContainer" containerID="5c692948aacc3d3f2c51507e9355b360432d5ecc04a266fbee62440cff7ee453"
Apr 23 17:01:24.023667 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:24.023632 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx"]
Apr 23 17:01:24.027657 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:24.027629 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-65m96rx"]
Apr 23 17:01:24.043582 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:24.043521 2580 scope.go:117] "RemoveContainer" containerID="f0f34d6c20c11ab004e3e91716978ce530ccb0b09b6237345ae0d14941df26bc"
Apr 23 17:01:24.048164 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:24.048139 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft"]
Apr 23 17:01:24.054508 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:24.054486 2580 scope.go:117] "RemoveContainer" containerID="6c2d2fb1e7e999a0d56dc8a831e19ec7f8f651379512ca7cd1f0cfa7449e9de3"
Apr 23 17:01:24.054806 ip-10-0-128-198 kubenswrapper[2580]: E0423 17:01:24.054782 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c2d2fb1e7e999a0d56dc8a831e19ec7f8f651379512ca7cd1f0cfa7449e9de3\": container with ID starting with 6c2d2fb1e7e999a0d56dc8a831e19ec7f8f651379512ca7cd1f0cfa7449e9de3 not found: ID does not exist" containerID="6c2d2fb1e7e999a0d56dc8a831e19ec7f8f651379512ca7cd1f0cfa7449e9de3"
Apr 23 17:01:24.054895 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:24.054871 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c2d2fb1e7e999a0d56dc8a831e19ec7f8f651379512ca7cd1f0cfa7449e9de3"} err="failed to get container status \"6c2d2fb1e7e999a0d56dc8a831e19ec7f8f651379512ca7cd1f0cfa7449e9de3\": rpc error: code = NotFound desc = could not find container \"6c2d2fb1e7e999a0d56dc8a831e19ec7f8f651379512ca7cd1f0cfa7449e9de3\": container with ID starting with 6c2d2fb1e7e999a0d56dc8a831e19ec7f8f651379512ca7cd1f0cfa7449e9de3 not found: ID does not exist"
Apr 23 17:01:24.054969 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:24.054902 2580 scope.go:117] "RemoveContainer" containerID="5c692948aacc3d3f2c51507e9355b360432d5ecc04a266fbee62440cff7ee453"
Apr 23 17:01:24.055185 ip-10-0-128-198 kubenswrapper[2580]: E0423 17:01:24.055167 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c692948aacc3d3f2c51507e9355b360432d5ecc04a266fbee62440cff7ee453\": container with ID starting with 5c692948aacc3d3f2c51507e9355b360432d5ecc04a266fbee62440cff7ee453 not found: ID does not exist" containerID="5c692948aacc3d3f2c51507e9355b360432d5ecc04a266fbee62440cff7ee453"
Apr 23 17:01:24.055241 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:24.055193 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c692948aacc3d3f2c51507e9355b360432d5ecc04a266fbee62440cff7ee453"} err="failed to get container status \"5c692948aacc3d3f2c51507e9355b360432d5ecc04a266fbee62440cff7ee453\": rpc error: code = NotFound desc = could not find container \"5c692948aacc3d3f2c51507e9355b360432d5ecc04a266fbee62440cff7ee453\": container with ID starting with 5c692948aacc3d3f2c51507e9355b360432d5ecc04a266fbee62440cff7ee453 not found: ID does not exist"
Apr 23 17:01:24.055241 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:24.055210 2580 scope.go:117] "RemoveContainer" containerID="f0f34d6c20c11ab004e3e91716978ce530ccb0b09b6237345ae0d14941df26bc"
Apr 23 17:01:24.055484 ip-10-0-128-198 kubenswrapper[2580]: E0423 17:01:24.055459 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0f34d6c20c11ab004e3e91716978ce530ccb0b09b6237345ae0d14941df26bc\": container with ID starting with f0f34d6c20c11ab004e3e91716978ce530ccb0b09b6237345ae0d14941df26bc not found: ID does not exist" containerID="f0f34d6c20c11ab004e3e91716978ce530ccb0b09b6237345ae0d14941df26bc"
Apr 23 17:01:24.055587 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:24.055489 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0f34d6c20c11ab004e3e91716978ce530ccb0b09b6237345ae0d14941df26bc"} err="failed to get container status \"f0f34d6c20c11ab004e3e91716978ce530ccb0b09b6237345ae0d14941df26bc\": rpc error: code = NotFound desc = could not find container \"f0f34d6c20c11ab004e3e91716978ce530ccb0b09b6237345ae0d14941df26bc\": container with ID starting with f0f34d6c20c11ab004e3e91716978ce530ccb0b09b6237345ae0d14941df26bc not found: ID does not exist"
Apr 23 17:01:24.055587 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:24.055503 2580 scope.go:117] "RemoveContainer" containerID="6c2d2fb1e7e999a0d56dc8a831e19ec7f8f651379512ca7cd1f0cfa7449e9de3"
Apr 23 17:01:24.055770 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:24.055745 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c2d2fb1e7e999a0d56dc8a831e19ec7f8f651379512ca7cd1f0cfa7449e9de3"} err="failed to get container status \"6c2d2fb1e7e999a0d56dc8a831e19ec7f8f651379512ca7cd1f0cfa7449e9de3\": rpc error: code = NotFound desc = could not find container \"6c2d2fb1e7e999a0d56dc8a831e19ec7f8f651379512ca7cd1f0cfa7449e9de3\": container with ID starting with 6c2d2fb1e7e999a0d56dc8a831e19ec7f8f651379512ca7cd1f0cfa7449e9de3 not found: ID does not exist"
Apr 23 17:01:24.055825 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:24.055773 2580 scope.go:117] "RemoveContainer" containerID="5c692948aacc3d3f2c51507e9355b360432d5ecc04a266fbee62440cff7ee453"
Apr 23 17:01:24.056036 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:24.056016 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c692948aacc3d3f2c51507e9355b360432d5ecc04a266fbee62440cff7ee453"} err="failed to get container status \"5c692948aacc3d3f2c51507e9355b360432d5ecc04a266fbee62440cff7ee453\": rpc error: code = NotFound desc = could not find container \"5c692948aacc3d3f2c51507e9355b360432d5ecc04a266fbee62440cff7ee453\": container with ID starting with 5c692948aacc3d3f2c51507e9355b360432d5ecc04a266fbee62440cff7ee453 not found: ID does not exist"
Apr 23 17:01:24.056120 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:24.056037 2580 scope.go:117] "RemoveContainer" containerID="f0f34d6c20c11ab004e3e91716978ce530ccb0b09b6237345ae0d14941df26bc"
Apr 23 17:01:24.056285 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:24.056266 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0f34d6c20c11ab004e3e91716978ce530ccb0b09b6237345ae0d14941df26bc"} err="failed to get container status \"f0f34d6c20c11ab004e3e91716978ce530ccb0b09b6237345ae0d14941df26bc\": rpc error: code = NotFound desc = could not find container \"f0f34d6c20c11ab004e3e91716978ce530ccb0b09b6237345ae0d14941df26bc\": container with ID starting with f0f34d6c20c11ab004e3e91716978ce530ccb0b09b6237345ae0d14941df26bc not found: ID does not exist"
Apr 23 17:01:24.056364 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:24.056287 2580 scope.go:117] "RemoveContainer" containerID="54c3b71258b1624787128406e013fd540b9a4be990b4a55115d3fc6afda1f69a"
Apr 23 17:01:24.057133 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:24.057108 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-ddc8448b7-4h8ft"]
Apr 23 17:01:24.076536 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:24.076515 2580 scope.go:117] "RemoveContainer" containerID="5dfc3d21b40551e8ce1fbea7b093b7d764affa5ed406f86329f82f24e20a9f02"
Apr 23 17:01:24.112198 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:24.112167 2580 scope.go:117] "RemoveContainer" containerID="54c3b71258b1624787128406e013fd540b9a4be990b4a55115d3fc6afda1f69a"
Apr 23 17:01:24.112520 ip-10-0-128-198 kubenswrapper[2580]: E0423 17:01:24.112501 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54c3b71258b1624787128406e013fd540b9a4be990b4a55115d3fc6afda1f69a\": container with ID starting with 54c3b71258b1624787128406e013fd540b9a4be990b4a55115d3fc6afda1f69a not found: ID does not exist" containerID="54c3b71258b1624787128406e013fd540b9a4be990b4a55115d3fc6afda1f69a"
Apr 23 17:01:24.112608 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:24.112527 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54c3b71258b1624787128406e013fd540b9a4be990b4a55115d3fc6afda1f69a"} err="failed to get container status \"54c3b71258b1624787128406e013fd540b9a4be990b4a55115d3fc6afda1f69a\": rpc error: code = NotFound desc = could not find container \"54c3b71258b1624787128406e013fd540b9a4be990b4a55115d3fc6afda1f69a\": container with ID starting with 54c3b71258b1624787128406e013fd540b9a4be990b4a55115d3fc6afda1f69a not found: ID does not exist"
Apr 23 17:01:24.112608 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:24.112551 2580 scope.go:117] "RemoveContainer" containerID="5dfc3d21b40551e8ce1fbea7b093b7d764affa5ed406f86329f82f24e20a9f02"
Apr 23 17:01:24.112844 ip-10-0-128-198 kubenswrapper[2580]: E0423 17:01:24.112815 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dfc3d21b40551e8ce1fbea7b093b7d764affa5ed406f86329f82f24e20a9f02\": container with ID starting with 5dfc3d21b40551e8ce1fbea7b093b7d764affa5ed406f86329f82f24e20a9f02 not found: ID does not exist" containerID="5dfc3d21b40551e8ce1fbea7b093b7d764affa5ed406f86329f82f24e20a9f02"
Apr 23 17:01:24.112915 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:24.112853 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dfc3d21b40551e8ce1fbea7b093b7d764affa5ed406f86329f82f24e20a9f02"} err="failed to get container status \"5dfc3d21b40551e8ce1fbea7b093b7d764affa5ed406f86329f82f24e20a9f02\": rpc error: code = NotFound desc = could not find container \"5dfc3d21b40551e8ce1fbea7b093b7d764affa5ed406f86329f82f24e20a9f02\": container with ID starting with 5dfc3d21b40551e8ce1fbea7b093b7d764affa5ed406f86329f82f24e20a9f02 not found: ID does not exist"
Apr 23 17:01:24.538972 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:24.538933 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68b434c2-61f8-4afa-9a95-3443e031bff5" path="/var/lib/kubelet/pods/68b434c2-61f8-4afa-9a95-3443e031bff5/volumes"
Apr 23 17:01:24.539697 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:24.539676 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6a10f65-efca-46fc-aa37-3712b44c85bb" path="/var/lib/kubelet/pods/a6a10f65-efca-46fc-aa37-3712b44c85bb/volumes"
Apr 23 17:01:28.481989 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:28.481925 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c831cb87-8c1e-44b9-946a-d7c02d40b92e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.65:8000/health\": dial tcp 10.132.0.65:8000: connect: connection refused"
Apr 23 17:01:30.766140 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:30.766104 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" podUID="06e48821-aabf-4e37-9252-48411a197de1" containerName="main" probeResult="failure" output="Get \"https://10.132.0.66:8000/health\": dial tcp 10.132.0.66:8000: connect: connection refused"
Apr 23 17:01:38.481196 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:38.481150 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c831cb87-8c1e-44b9-946a-d7c02d40b92e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.65:8000/health\": dial tcp 10.132.0.65:8000: connect: connection refused"
Apr 23 17:01:40.765673 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:40.765619 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" podUID="06e48821-aabf-4e37-9252-48411a197de1" containerName="main" probeResult="failure" output="Get \"https://10.132.0.66:8000/health\": dial tcp 10.132.0.66:8000: connect: connection refused"
Apr 23 17:01:48.491393 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:48.491359 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 23 17:01:48.499246 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:48.499215 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 23 17:01:50.766234 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:50.766119 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" podUID="06e48821-aabf-4e37-9252-48411a197de1" containerName="main" probeResult="failure" output="Get \"https://10.132.0.66:8000/health\": dial tcp 10.132.0.66:8000: connect: connection refused"
Apr 23 17:01:56.364222 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:56.364181 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 23 17:01:56.364706 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:56.364491 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c831cb87-8c1e-44b9-946a-d7c02d40b92e" containerName="main" containerID="cri-o://1ed0aec055f133b80b6b4512e8bc019e05e98b409b555e35912c32eaa138684d" gracePeriod=30
Apr 23 17:01:57.143410 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:57.143274 2580 generic.go:358] "Generic (PLEG): container finished" podID="c831cb87-8c1e-44b9-946a-d7c02d40b92e" containerID="1ed0aec055f133b80b6b4512e8bc019e05e98b409b555e35912c32eaa138684d" exitCode=0
Apr 23 17:01:57.143410 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:57.143326 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"c831cb87-8c1e-44b9-946a-d7c02d40b92e","Type":"ContainerDied","Data":"1ed0aec055f133b80b6b4512e8bc019e05e98b409b555e35912c32eaa138684d"}
Apr 23 17:01:57.218311 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:57.218270 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 23 17:01:57.362049 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:57.361967 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c831cb87-8c1e-44b9-946a-d7c02d40b92e-model-cache\") pod \"c831cb87-8c1e-44b9-946a-d7c02d40b92e\" (UID: \"c831cb87-8c1e-44b9-946a-d7c02d40b92e\") "
Apr 23 17:01:57.362049 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:57.362026 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c831cb87-8c1e-44b9-946a-d7c02d40b92e-tls-certs\") pod \"c831cb87-8c1e-44b9-946a-d7c02d40b92e\" (UID: \"c831cb87-8c1e-44b9-946a-d7c02d40b92e\") "
Apr 23 17:01:57.362288 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:57.362066 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c831cb87-8c1e-44b9-946a-d7c02d40b92e-kserve-provision-location\") pod \"c831cb87-8c1e-44b9-946a-d7c02d40b92e\" (UID: \"c831cb87-8c1e-44b9-946a-d7c02d40b92e\") "
Apr 23 17:01:57.362288 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:57.362098 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c831cb87-8c1e-44b9-946a-d7c02d40b92e-dshm\") pod \"c831cb87-8c1e-44b9-946a-d7c02d40b92e\" (UID: \"c831cb87-8c1e-44b9-946a-d7c02d40b92e\") "
Apr 23 17:01:57.362288 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:57.362126 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c831cb87-8c1e-44b9-946a-d7c02d40b92e-home\") pod \"c831cb87-8c1e-44b9-946a-d7c02d40b92e\" (UID: \"c831cb87-8c1e-44b9-946a-d7c02d40b92e\") "
Apr 23 17:01:57.362288 ip-10-0-128-198 kubenswrapper[2580]: I0423
17:01:57.362151 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t986\" (UniqueName: \"kubernetes.io/projected/c831cb87-8c1e-44b9-946a-d7c02d40b92e-kube-api-access-7t986\") pod \"c831cb87-8c1e-44b9-946a-d7c02d40b92e\" (UID: \"c831cb87-8c1e-44b9-946a-d7c02d40b92e\") " Apr 23 17:01:57.362288 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:57.362206 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c831cb87-8c1e-44b9-946a-d7c02d40b92e-model-cache" (OuterVolumeSpecName: "model-cache") pod "c831cb87-8c1e-44b9-946a-d7c02d40b92e" (UID: "c831cb87-8c1e-44b9-946a-d7c02d40b92e"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:01:57.362595 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:57.362395 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c831cb87-8c1e-44b9-946a-d7c02d40b92e-model-cache\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:01:57.362655 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:57.362612 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c831cb87-8c1e-44b9-946a-d7c02d40b92e-home" (OuterVolumeSpecName: "home") pod "c831cb87-8c1e-44b9-946a-d7c02d40b92e" (UID: "c831cb87-8c1e-44b9-946a-d7c02d40b92e"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:01:57.364505 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:57.364331 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c831cb87-8c1e-44b9-946a-d7c02d40b92e-dshm" (OuterVolumeSpecName: "dshm") pod "c831cb87-8c1e-44b9-946a-d7c02d40b92e" (UID: "c831cb87-8c1e-44b9-946a-d7c02d40b92e"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:01:57.364505 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:57.364346 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c831cb87-8c1e-44b9-946a-d7c02d40b92e-kube-api-access-7t986" (OuterVolumeSpecName: "kube-api-access-7t986") pod "c831cb87-8c1e-44b9-946a-d7c02d40b92e" (UID: "c831cb87-8c1e-44b9-946a-d7c02d40b92e"). InnerVolumeSpecName "kube-api-access-7t986". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:01:57.364951 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:57.364520 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c831cb87-8c1e-44b9-946a-d7c02d40b92e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c831cb87-8c1e-44b9-946a-d7c02d40b92e" (UID: "c831cb87-8c1e-44b9-946a-d7c02d40b92e"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:01:57.418539 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:57.418489 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c831cb87-8c1e-44b9-946a-d7c02d40b92e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c831cb87-8c1e-44b9-946a-d7c02d40b92e" (UID: "c831cb87-8c1e-44b9-946a-d7c02d40b92e"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:01:57.463010 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:57.462969 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c831cb87-8c1e-44b9-946a-d7c02d40b92e-tls-certs\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:01:57.463010 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:57.462998 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c831cb87-8c1e-44b9-946a-d7c02d40b92e-kserve-provision-location\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:01:57.463010 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:57.463007 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c831cb87-8c1e-44b9-946a-d7c02d40b92e-dshm\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:01:57.463010 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:57.463016 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c831cb87-8c1e-44b9-946a-d7c02d40b92e-home\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:01:57.463369 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:57.463025 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7t986\" (UniqueName: \"kubernetes.io/projected/c831cb87-8c1e-44b9-946a-d7c02d40b92e-kube-api-access-7t986\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:01:58.149386 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.149286 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"c831cb87-8c1e-44b9-946a-d7c02d40b92e","Type":"ContainerDied","Data":"ff9a4c6112ba1c6b613ad2605c897dd67eeeae7c0e6879386ea813279f8fa4b1"} Apr 23 17:01:58.149386 
ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.149364 2580 scope.go:117] "RemoveContainer" containerID="1ed0aec055f133b80b6b4512e8bc019e05e98b409b555e35912c32eaa138684d" Apr 23 17:01:58.149386 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.149370 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 23 17:01:58.174844 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.174820 2580 scope.go:117] "RemoveContainer" containerID="0d39b759f07b8919d07bb31a95f96645abd07c948624dd3598486ab0123e1785" Apr 23 17:01:58.180240 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.180196 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 23 17:01:58.184110 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.184082 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 23 17:01:58.538467 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.538434 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c831cb87-8c1e-44b9-946a-d7c02d40b92e" path="/var/lib/kubelet/pods/c831cb87-8c1e-44b9-946a-d7c02d40b92e/volumes" Apr 23 17:01:58.874150 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.874064 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj"] Apr 23 17:01:58.874892 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.874866 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6a10f65-efca-46fc-aa37-3712b44c85bb" containerName="main" Apr 23 17:01:58.874892 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.874891 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6a10f65-efca-46fc-aa37-3712b44c85bb" containerName="main" Apr 23 17:01:58.875050 ip-10-0-128-198 
kubenswrapper[2580]: I0423 17:01:58.874902 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c831cb87-8c1e-44b9-946a-d7c02d40b92e" containerName="main" Apr 23 17:01:58.875050 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.874908 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c831cb87-8c1e-44b9-946a-d7c02d40b92e" containerName="main" Apr 23 17:01:58.875050 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.874922 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c831cb87-8c1e-44b9-946a-d7c02d40b92e" containerName="storage-initializer" Apr 23 17:01:58.875050 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.874930 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c831cb87-8c1e-44b9-946a-d7c02d40b92e" containerName="storage-initializer" Apr 23 17:01:58.875050 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.874943 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6a10f65-efca-46fc-aa37-3712b44c85bb" containerName="storage-initializer" Apr 23 17:01:58.875050 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.874948 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6a10f65-efca-46fc-aa37-3712b44c85bb" containerName="storage-initializer" Apr 23 17:01:58.875050 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.874968 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68b434c2-61f8-4afa-9a95-3443e031bff5" containerName="main" Apr 23 17:01:58.875050 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.874974 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="68b434c2-61f8-4afa-9a95-3443e031bff5" containerName="main" Apr 23 17:01:58.875050 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.874983 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68b434c2-61f8-4afa-9a95-3443e031bff5" containerName="storage-initializer" Apr 23 17:01:58.875050 
ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.874989 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="68b434c2-61f8-4afa-9a95-3443e031bff5" containerName="storage-initializer" Apr 23 17:01:58.875050 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.874998 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68b434c2-61f8-4afa-9a95-3443e031bff5" containerName="llm-d-routing-sidecar" Apr 23 17:01:58.875050 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.875004 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="68b434c2-61f8-4afa-9a95-3443e031bff5" containerName="llm-d-routing-sidecar" Apr 23 17:01:58.875506 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.875071 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="68b434c2-61f8-4afa-9a95-3443e031bff5" containerName="llm-d-routing-sidecar" Apr 23 17:01:58.875506 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.875083 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="a6a10f65-efca-46fc-aa37-3712b44c85bb" containerName="main" Apr 23 17:01:58.875506 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.875090 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="c831cb87-8c1e-44b9-946a-d7c02d40b92e" containerName="main" Apr 23 17:01:58.875506 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.875096 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="68b434c2-61f8-4afa-9a95-3443e031bff5" containerName="main" Apr 23 17:01:58.880442 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.880415 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" Apr 23 17:01:58.883226 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.883203 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 23 17:01:58.891400 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.890360 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj"] Apr 23 17:01:58.976581 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.976548 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2bffe611-bf91-4e96-b44e-e2a68e638034-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-68649cc7f9-gg7gj\" (UID: \"2bffe611-bf91-4e96-b44e-e2a68e638034\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" Apr 23 17:01:58.976581 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.976587 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqj6s\" (UniqueName: \"kubernetes.io/projected/2bffe611-bf91-4e96-b44e-e2a68e638034-kube-api-access-sqj6s\") pod \"scheduler-inline-config-test-kserve-68649cc7f9-gg7gj\" (UID: \"2bffe611-bf91-4e96-b44e-e2a68e638034\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" Apr 23 17:01:58.976787 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.976626 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2bffe611-bf91-4e96-b44e-e2a68e638034-home\") pod \"scheduler-inline-config-test-kserve-68649cc7f9-gg7gj\" (UID: \"2bffe611-bf91-4e96-b44e-e2a68e638034\") " 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" Apr 23 17:01:58.976787 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.976686 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2bffe611-bf91-4e96-b44e-e2a68e638034-tls-certs\") pod \"scheduler-inline-config-test-kserve-68649cc7f9-gg7gj\" (UID: \"2bffe611-bf91-4e96-b44e-e2a68e638034\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" Apr 23 17:01:58.976787 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.976736 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2bffe611-bf91-4e96-b44e-e2a68e638034-dshm\") pod \"scheduler-inline-config-test-kserve-68649cc7f9-gg7gj\" (UID: \"2bffe611-bf91-4e96-b44e-e2a68e638034\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" Apr 23 17:01:58.976787 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:58.976778 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2bffe611-bf91-4e96-b44e-e2a68e638034-model-cache\") pod \"scheduler-inline-config-test-kserve-68649cc7f9-gg7gj\" (UID: \"2bffe611-bf91-4e96-b44e-e2a68e638034\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" Apr 23 17:01:59.078146 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.078108 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2bffe611-bf91-4e96-b44e-e2a68e638034-home\") pod \"scheduler-inline-config-test-kserve-68649cc7f9-gg7gj\" (UID: \"2bffe611-bf91-4e96-b44e-e2a68e638034\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" Apr 23 17:01:59.078146 ip-10-0-128-198 kubenswrapper[2580]: 
I0423 17:01:59.078153 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2bffe611-bf91-4e96-b44e-e2a68e638034-tls-certs\") pod \"scheduler-inline-config-test-kserve-68649cc7f9-gg7gj\" (UID: \"2bffe611-bf91-4e96-b44e-e2a68e638034\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" Apr 23 17:01:59.078427 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.078188 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2bffe611-bf91-4e96-b44e-e2a68e638034-dshm\") pod \"scheduler-inline-config-test-kserve-68649cc7f9-gg7gj\" (UID: \"2bffe611-bf91-4e96-b44e-e2a68e638034\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" Apr 23 17:01:59.078427 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.078217 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2bffe611-bf91-4e96-b44e-e2a68e638034-model-cache\") pod \"scheduler-inline-config-test-kserve-68649cc7f9-gg7gj\" (UID: \"2bffe611-bf91-4e96-b44e-e2a68e638034\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" Apr 23 17:01:59.078427 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.078264 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2bffe611-bf91-4e96-b44e-e2a68e638034-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-68649cc7f9-gg7gj\" (UID: \"2bffe611-bf91-4e96-b44e-e2a68e638034\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" Apr 23 17:01:59.078427 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.078316 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sqj6s\" (UniqueName: 
\"kubernetes.io/projected/2bffe611-bf91-4e96-b44e-e2a68e638034-kube-api-access-sqj6s\") pod \"scheduler-inline-config-test-kserve-68649cc7f9-gg7gj\" (UID: \"2bffe611-bf91-4e96-b44e-e2a68e638034\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" Apr 23 17:01:59.078668 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.078640 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2bffe611-bf91-4e96-b44e-e2a68e638034-model-cache\") pod \"scheduler-inline-config-test-kserve-68649cc7f9-gg7gj\" (UID: \"2bffe611-bf91-4e96-b44e-e2a68e638034\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" Apr 23 17:01:59.078767 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.078744 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2bffe611-bf91-4e96-b44e-e2a68e638034-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-68649cc7f9-gg7gj\" (UID: \"2bffe611-bf91-4e96-b44e-e2a68e638034\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" Apr 23 17:01:59.078871 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.078847 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2bffe611-bf91-4e96-b44e-e2a68e638034-home\") pod \"scheduler-inline-config-test-kserve-68649cc7f9-gg7gj\" (UID: \"2bffe611-bf91-4e96-b44e-e2a68e638034\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" Apr 23 17:01:59.081028 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.081008 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2bffe611-bf91-4e96-b44e-e2a68e638034-dshm\") pod \"scheduler-inline-config-test-kserve-68649cc7f9-gg7gj\" (UID: 
\"2bffe611-bf91-4e96-b44e-e2a68e638034\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" Apr 23 17:01:59.081428 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.081408 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2bffe611-bf91-4e96-b44e-e2a68e638034-tls-certs\") pod \"scheduler-inline-config-test-kserve-68649cc7f9-gg7gj\" (UID: \"2bffe611-bf91-4e96-b44e-e2a68e638034\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" Apr 23 17:01:59.086590 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.086570 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqj6s\" (UniqueName: \"kubernetes.io/projected/2bffe611-bf91-4e96-b44e-e2a68e638034-kube-api-access-sqj6s\") pod \"scheduler-inline-config-test-kserve-68649cc7f9-gg7gj\" (UID: \"2bffe611-bf91-4e96-b44e-e2a68e638034\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" Apr 23 17:01:59.156947 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.156875 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2"] Apr 23 17:01:59.167592 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.166679 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" Apr 23 17:01:59.169457 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.169422 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-epp-sa-dockercfg-prchk\"" Apr 23 17:01:59.169868 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.169832 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2"] Apr 23 17:01:59.196898 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.196863 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" Apr 23 17:01:59.280805 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.280711 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6f482a-2e46-4db9-b716-b9d30c3daf43-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2\" (UID: \"2e6f482a-2e46-4db9-b716-b9d30c3daf43\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" Apr 23 17:01:59.280805 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.280791 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2e6f482a-2e46-4db9-b716-b9d30c3daf43-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2\" (UID: \"2e6f482a-2e46-4db9-b716-b9d30c3daf43\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" Apr 23 17:01:59.281075 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.280840 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2e6f482a-2e46-4db9-b716-b9d30c3daf43-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2\" (UID: \"2e6f482a-2e46-4db9-b716-b9d30c3daf43\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" Apr 23 17:01:59.281075 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.280911 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wbvk\" (UniqueName: \"kubernetes.io/projected/2e6f482a-2e46-4db9-b716-b9d30c3daf43-kube-api-access-6wbvk\") pod \"scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2\" (UID: \"2e6f482a-2e46-4db9-b716-b9d30c3daf43\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" Apr 23 17:01:59.281075 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.280988 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e6f482a-2e46-4db9-b716-b9d30c3daf43-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2\" (UID: \"2e6f482a-2e46-4db9-b716-b9d30c3daf43\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" Apr 23 17:01:59.281249 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.281082 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2e6f482a-2e46-4db9-b716-b9d30c3daf43-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2\" (UID: \"2e6f482a-2e46-4db9-b716-b9d30c3daf43\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" Apr 23 17:01:59.349934 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.349905 2580 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj"] Apr 23 17:01:59.351999 ip-10-0-128-198 kubenswrapper[2580]: W0423 17:01:59.351972 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bffe611_bf91_4e96_b44e_e2a68e638034.slice/crio-96c3a2df92669720503ec643a3d6a201d8f47b95c8084d1726cd4d6ca2138d24 WatchSource:0}: Error finding container 96c3a2df92669720503ec643a3d6a201d8f47b95c8084d1726cd4d6ca2138d24: Status 404 returned error can't find the container with id 96c3a2df92669720503ec643a3d6a201d8f47b95c8084d1726cd4d6ca2138d24 Apr 23 17:01:59.383321 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.383235 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2e6f482a-2e46-4db9-b716-b9d30c3daf43-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2\" (UID: \"2e6f482a-2e46-4db9-b716-b9d30c3daf43\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" Apr 23 17:01:59.383321 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.383332 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2e6f482a-2e46-4db9-b716-b9d30c3daf43-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2\" (UID: \"2e6f482a-2e46-4db9-b716-b9d30c3daf43\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" Apr 23 17:01:59.383657 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.383377 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6wbvk\" (UniqueName: \"kubernetes.io/projected/2e6f482a-2e46-4db9-b716-b9d30c3daf43-kube-api-access-6wbvk\") pod \"scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2\" (UID: 
\"2e6f482a-2e46-4db9-b716-b9d30c3daf43\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" Apr 23 17:01:59.383657 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.383420 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e6f482a-2e46-4db9-b716-b9d30c3daf43-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2\" (UID: \"2e6f482a-2e46-4db9-b716-b9d30c3daf43\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" Apr 23 17:01:59.383657 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.383521 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2e6f482a-2e46-4db9-b716-b9d30c3daf43-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2\" (UID: \"2e6f482a-2e46-4db9-b716-b9d30c3daf43\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" Apr 23 17:01:59.383657 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.383619 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6f482a-2e46-4db9-b716-b9d30c3daf43-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2\" (UID: \"2e6f482a-2e46-4db9-b716-b9d30c3daf43\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" Apr 23 17:01:59.384895 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.384562 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2e6f482a-2e46-4db9-b716-b9d30c3daf43-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2\" (UID: \"2e6f482a-2e46-4db9-b716-b9d30c3daf43\") " 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" Apr 23 17:01:59.384895 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.384676 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2e6f482a-2e46-4db9-b716-b9d30c3daf43-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2\" (UID: \"2e6f482a-2e46-4db9-b716-b9d30c3daf43\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" Apr 23 17:01:59.384895 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.384834 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e6f482a-2e46-4db9-b716-b9d30c3daf43-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2\" (UID: \"2e6f482a-2e46-4db9-b716-b9d30c3daf43\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" Apr 23 17:01:59.385175 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.384957 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2e6f482a-2e46-4db9-b716-b9d30c3daf43-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2\" (UID: \"2e6f482a-2e46-4db9-b716-b9d30c3daf43\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" Apr 23 17:01:59.388122 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.388086 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6f482a-2e46-4db9-b716-b9d30c3daf43-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2\" (UID: \"2e6f482a-2e46-4db9-b716-b9d30c3daf43\") " 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" Apr 23 17:01:59.401792 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.401755 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wbvk\" (UniqueName: \"kubernetes.io/projected/2e6f482a-2e46-4db9-b716-b9d30c3daf43-kube-api-access-6wbvk\") pod \"scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2\" (UID: \"2e6f482a-2e46-4db9-b716-b9d30c3daf43\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" Apr 23 17:01:59.478465 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.478429 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" Apr 23 17:01:59.646269 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:01:59.646242 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2"] Apr 23 17:01:59.648520 ip-10-0-128-198 kubenswrapper[2580]: W0423 17:01:59.648491 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e6f482a_2e46_4db9_b716_b9d30c3daf43.slice/crio-f290d4591c0c1332aa460d1609ef4726c8233e127b89f0d15e6e00fffb3746a1 WatchSource:0}: Error finding container f290d4591c0c1332aa460d1609ef4726c8233e127b89f0d15e6e00fffb3746a1: Status 404 returned error can't find the container with id f290d4591c0c1332aa460d1609ef4726c8233e127b89f0d15e6e00fffb3746a1 Apr 23 17:02:00.170155 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:00.170096 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" event={"ID":"2e6f482a-2e46-4db9-b716-b9d30c3daf43","Type":"ContainerStarted","Data":"10288fb2d46caa569a41489013fabd8bc8c348bdfca5d5f99e49300607c7b77a"} Apr 23 
17:02:00.170155 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:00.170154 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" event={"ID":"2e6f482a-2e46-4db9-b716-b9d30c3daf43","Type":"ContainerStarted","Data":"f290d4591c0c1332aa460d1609ef4726c8233e127b89f0d15e6e00fffb3746a1"} Apr 23 17:02:00.171948 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:00.171916 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" event={"ID":"2bffe611-bf91-4e96-b44e-e2a68e638034","Type":"ContainerStarted","Data":"6b64854c0dfc11e0374d50411a9061896193a6d5f123691687ca1b20389c1d45"} Apr 23 17:02:00.172048 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:00.171969 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" event={"ID":"2bffe611-bf91-4e96-b44e-e2a68e638034","Type":"ContainerStarted","Data":"96c3a2df92669720503ec643a3d6a201d8f47b95c8084d1726cd4d6ca2138d24"} Apr 23 17:02:00.766404 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:00.766363 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" podUID="06e48821-aabf-4e37-9252-48411a197de1" containerName="main" probeResult="failure" output="Get \"https://10.132.0.66:8000/health\": dial tcp 10.132.0.66:8000: connect: connection refused" Apr 23 17:02:01.180498 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:01.180379 2580 generic.go:358] "Generic (PLEG): container finished" podID="2e6f482a-2e46-4db9-b716-b9d30c3daf43" containerID="10288fb2d46caa569a41489013fabd8bc8c348bdfca5d5f99e49300607c7b77a" exitCode=0 Apr 23 17:02:01.180689 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:01.180503 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" event={"ID":"2e6f482a-2e46-4db9-b716-b9d30c3daf43","Type":"ContainerDied","Data":"10288fb2d46caa569a41489013fabd8bc8c348bdfca5d5f99e49300607c7b77a"} Apr 23 17:02:02.189097 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:02.189052 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" event={"ID":"2e6f482a-2e46-4db9-b716-b9d30c3daf43","Type":"ContainerStarted","Data":"52acdccada47dfe7a4cc37b0dfa6260b338310a8a8564cd6d668b0cf68071b36"} Apr 23 17:02:02.189097 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:02.189107 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" event={"ID":"2e6f482a-2e46-4db9-b716-b9d30c3daf43","Type":"ContainerStarted","Data":"9d243f4171553df14f4d83c02f6254dd8db413b795f986de2d10be2e46bd3d7a"} Apr 23 17:02:02.189574 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:02.189169 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" Apr 23 17:02:02.212375 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:02.212314 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" podStartSLOduration=3.212281378 podStartE2EDuration="3.212281378s" podCreationTimestamp="2026-04-23 17:01:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:02:02.209328943 +0000 UTC m=+1608.270403208" watchObservedRunningTime="2026-04-23 17:02:02.212281378 +0000 UTC m=+1608.273355631" Apr 23 17:02:04.199212 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:04.199176 2580 generic.go:358] "Generic (PLEG): 
container finished" podID="2bffe611-bf91-4e96-b44e-e2a68e638034" containerID="6b64854c0dfc11e0374d50411a9061896193a6d5f123691687ca1b20389c1d45" exitCode=0 Apr 23 17:02:04.199686 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:04.199256 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" event={"ID":"2bffe611-bf91-4e96-b44e-e2a68e638034","Type":"ContainerDied","Data":"6b64854c0dfc11e0374d50411a9061896193a6d5f123691687ca1b20389c1d45"} Apr 23 17:02:05.206055 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:05.206012 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" event={"ID":"2bffe611-bf91-4e96-b44e-e2a68e638034","Type":"ContainerStarted","Data":"837a46fc5d31c5f20f15506eb1dcc89a53be3a8b4346b10f16db86265a3389c4"} Apr 23 17:02:05.227313 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:05.227232 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" podStartSLOduration=7.227213055 podStartE2EDuration="7.227213055s" podCreationTimestamp="2026-04-23 17:01:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:02:05.224935218 +0000 UTC m=+1611.286009473" watchObservedRunningTime="2026-04-23 17:02:05.227213055 +0000 UTC m=+1611.288287310" Apr 23 17:02:09.197251 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:09.197206 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" Apr 23 17:02:09.197741 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:09.197363 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" Apr 23 
17:02:09.210435 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:09.210408 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" Apr 23 17:02:09.237709 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:09.237675 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" Apr 23 17:02:09.478749 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:09.478709 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" Apr 23 17:02:09.478901 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:09.478864 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" Apr 23 17:02:09.481613 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:09.481590 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" Apr 23 17:02:10.231610 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:10.231583 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" Apr 23 17:02:10.766433 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:10.766391 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" podUID="06e48821-aabf-4e37-9252-48411a197de1" containerName="main" probeResult="failure" output="Get \"https://10.132.0.66:8000/health\": dial tcp 10.132.0.66:8000: connect: connection refused" Apr 23 17:02:20.765753 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:20.765713 2580 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" podUID="06e48821-aabf-4e37-9252-48411a197de1" containerName="main" probeResult="failure" output="Get \"https://10.132.0.66:8000/health\": dial tcp 10.132.0.66:8000: connect: connection refused" Apr 23 17:02:30.766005 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:30.765951 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" podUID="06e48821-aabf-4e37-9252-48411a197de1" containerName="main" probeResult="failure" output="Get \"https://10.132.0.66:8000/health\": dial tcp 10.132.0.66:8000: connect: connection refused" Apr 23 17:02:32.240056 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:32.240027 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" Apr 23 17:02:33.152101 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:33.152062 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2"] Apr 23 17:02:33.152468 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:33.152403 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" podUID="2e6f482a-2e46-4db9-b716-b9d30c3daf43" containerName="main" containerID="cri-o://9d243f4171553df14f4d83c02f6254dd8db413b795f986de2d10be2e46bd3d7a" gracePeriod=30 Apr 23 17:02:33.152679 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:33.152434 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" podUID="2e6f482a-2e46-4db9-b716-b9d30c3daf43" containerName="tokenizer" 
containerID="cri-o://52acdccada47dfe7a4cc37b0dfa6260b338310a8a8564cd6d668b0cf68071b36" gracePeriod=30 Apr 23 17:02:33.157786 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:33.157760 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj"] Apr 23 17:02:33.158273 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:33.158233 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" podUID="2bffe611-bf91-4e96-b44e-e2a68e638034" containerName="main" containerID="cri-o://837a46fc5d31c5f20f15506eb1dcc89a53be3a8b4346b10f16db86265a3389c4" gracePeriod=30 Apr 23 17:02:33.335616 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:33.335452 2580 generic.go:358] "Generic (PLEG): container finished" podID="2e6f482a-2e46-4db9-b716-b9d30c3daf43" containerID="9d243f4171553df14f4d83c02f6254dd8db413b795f986de2d10be2e46bd3d7a" exitCode=0 Apr 23 17:02:33.335616 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:33.335544 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" event={"ID":"2e6f482a-2e46-4db9-b716-b9d30c3daf43","Type":"ContainerDied","Data":"9d243f4171553df14f4d83c02f6254dd8db413b795f986de2d10be2e46bd3d7a"} Apr 23 17:02:33.338489 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:33.338457 2580 generic.go:358] "Generic (PLEG): container finished" podID="2bffe611-bf91-4e96-b44e-e2a68e638034" containerID="837a46fc5d31c5f20f15506eb1dcc89a53be3a8b4346b10f16db86265a3389c4" exitCode=0 Apr 23 17:02:33.338605 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:33.338525 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" 
event={"ID":"2bffe611-bf91-4e96-b44e-e2a68e638034","Type":"ContainerDied","Data":"837a46fc5d31c5f20f15506eb1dcc89a53be3a8b4346b10f16db86265a3389c4"} Apr 23 17:02:33.419919 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:33.419893 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" Apr 23 17:02:33.503543 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:33.503510 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2bffe611-bf91-4e96-b44e-e2a68e638034-tls-certs\") pod \"2bffe611-bf91-4e96-b44e-e2a68e638034\" (UID: \"2bffe611-bf91-4e96-b44e-e2a68e638034\") " Apr 23 17:02:33.503750 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:33.503564 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2bffe611-bf91-4e96-b44e-e2a68e638034-home\") pod \"2bffe611-bf91-4e96-b44e-e2a68e638034\" (UID: \"2bffe611-bf91-4e96-b44e-e2a68e638034\") " Apr 23 17:02:33.503750 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:33.503588 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2bffe611-bf91-4e96-b44e-e2a68e638034-dshm\") pod \"2bffe611-bf91-4e96-b44e-e2a68e638034\" (UID: \"2bffe611-bf91-4e96-b44e-e2a68e638034\") " Apr 23 17:02:33.503750 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:33.503618 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2bffe611-bf91-4e96-b44e-e2a68e638034-model-cache\") pod \"2bffe611-bf91-4e96-b44e-e2a68e638034\" (UID: \"2bffe611-bf91-4e96-b44e-e2a68e638034\") " Apr 23 17:02:33.503750 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:33.503648 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-sqj6s\" (UniqueName: \"kubernetes.io/projected/2bffe611-bf91-4e96-b44e-e2a68e638034-kube-api-access-sqj6s\") pod \"2bffe611-bf91-4e96-b44e-e2a68e638034\" (UID: \"2bffe611-bf91-4e96-b44e-e2a68e638034\") " Apr 23 17:02:33.503750 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:33.503710 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2bffe611-bf91-4e96-b44e-e2a68e638034-kserve-provision-location\") pod \"2bffe611-bf91-4e96-b44e-e2a68e638034\" (UID: \"2bffe611-bf91-4e96-b44e-e2a68e638034\") " Apr 23 17:02:33.504024 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:33.503856 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bffe611-bf91-4e96-b44e-e2a68e638034-home" (OuterVolumeSpecName: "home") pod "2bffe611-bf91-4e96-b44e-e2a68e638034" (UID: "2bffe611-bf91-4e96-b44e-e2a68e638034"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:02:33.504082 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:33.504061 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bffe611-bf91-4e96-b44e-e2a68e638034-model-cache" (OuterVolumeSpecName: "model-cache") pod "2bffe611-bf91-4e96-b44e-e2a68e638034" (UID: "2bffe611-bf91-4e96-b44e-e2a68e638034"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:02:33.504231 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:33.504005 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2bffe611-bf91-4e96-b44e-e2a68e638034-home\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:02:33.506226 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:33.506177 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bffe611-bf91-4e96-b44e-e2a68e638034-dshm" (OuterVolumeSpecName: "dshm") pod "2bffe611-bf91-4e96-b44e-e2a68e638034" (UID: "2bffe611-bf91-4e96-b44e-e2a68e638034"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:02:33.506226 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:33.506191 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bffe611-bf91-4e96-b44e-e2a68e638034-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2bffe611-bf91-4e96-b44e-e2a68e638034" (UID: "2bffe611-bf91-4e96-b44e-e2a68e638034"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:02:33.506431 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:33.506189 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bffe611-bf91-4e96-b44e-e2a68e638034-kube-api-access-sqj6s" (OuterVolumeSpecName: "kube-api-access-sqj6s") pod "2bffe611-bf91-4e96-b44e-e2a68e638034" (UID: "2bffe611-bf91-4e96-b44e-e2a68e638034"). InnerVolumeSpecName "kube-api-access-sqj6s". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:02:33.569037 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:33.568994 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bffe611-bf91-4e96-b44e-e2a68e638034-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2bffe611-bf91-4e96-b44e-e2a68e638034" (UID: "2bffe611-bf91-4e96-b44e-e2a68e638034"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:02:33.604630 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:33.604594 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2bffe611-bf91-4e96-b44e-e2a68e638034-kserve-provision-location\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:02:33.604630 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:33.604625 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2bffe611-bf91-4e96-b44e-e2a68e638034-tls-certs\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:02:33.604630 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:33.604636 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2bffe611-bf91-4e96-b44e-e2a68e638034-dshm\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:02:33.604861 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:33.604645 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2bffe611-bf91-4e96-b44e-e2a68e638034-model-cache\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:02:33.604861 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:33.604653 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sqj6s\" (UniqueName: 
\"kubernetes.io/projected/2bffe611-bf91-4e96-b44e-e2a68e638034-kube-api-access-sqj6s\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:02:34.344218 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:34.344190 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" Apr 23 17:02:34.344653 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:34.344179 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj" event={"ID":"2bffe611-bf91-4e96-b44e-e2a68e638034","Type":"ContainerDied","Data":"96c3a2df92669720503ec643a3d6a201d8f47b95c8084d1726cd4d6ca2138d24"} Apr 23 17:02:34.344653 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:34.344347 2580 scope.go:117] "RemoveContainer" containerID="837a46fc5d31c5f20f15506eb1dcc89a53be3a8b4346b10f16db86265a3389c4" Apr 23 17:02:34.365285 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:34.365263 2580 scope.go:117] "RemoveContainer" containerID="6b64854c0dfc11e0374d50411a9061896193a6d5f123691687ca1b20389c1d45" Apr 23 17:02:34.373834 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:34.373105 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj"] Apr 23 17:02:34.376899 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:34.376871 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-68649cc7f9-gg7gj"] Apr 23 17:02:34.515671 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:34.515648 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" Apr 23 17:02:34.542482 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:34.542384 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bffe611-bf91-4e96-b44e-e2a68e638034" path="/var/lib/kubelet/pods/2bffe611-bf91-4e96-b44e-e2a68e638034/volumes" Apr 23 17:02:34.613400 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:34.613368 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e6f482a-2e46-4db9-b716-b9d30c3daf43-kserve-provision-location\") pod \"2e6f482a-2e46-4db9-b716-b9d30c3daf43\" (UID: \"2e6f482a-2e46-4db9-b716-b9d30c3daf43\") " Apr 23 17:02:34.613549 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:34.613425 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2e6f482a-2e46-4db9-b716-b9d30c3daf43-tokenizer-tmp\") pod \"2e6f482a-2e46-4db9-b716-b9d30c3daf43\" (UID: \"2e6f482a-2e46-4db9-b716-b9d30c3daf43\") " Apr 23 17:02:34.613549 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:34.613453 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6f482a-2e46-4db9-b716-b9d30c3daf43-tls-certs\") pod \"2e6f482a-2e46-4db9-b716-b9d30c3daf43\" (UID: \"2e6f482a-2e46-4db9-b716-b9d30c3daf43\") " Apr 23 17:02:34.613549 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:34.613471 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2e6f482a-2e46-4db9-b716-b9d30c3daf43-tokenizer-cache\") pod \"2e6f482a-2e46-4db9-b716-b9d30c3daf43\" (UID: \"2e6f482a-2e46-4db9-b716-b9d30c3daf43\") " Apr 23 17:02:34.613549 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:34.613492 2580 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2e6f482a-2e46-4db9-b716-b9d30c3daf43-tokenizer-uds\") pod \"2e6f482a-2e46-4db9-b716-b9d30c3daf43\" (UID: \"2e6f482a-2e46-4db9-b716-b9d30c3daf43\") " Apr 23 17:02:34.613549 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:34.613527 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wbvk\" (UniqueName: \"kubernetes.io/projected/2e6f482a-2e46-4db9-b716-b9d30c3daf43-kube-api-access-6wbvk\") pod \"2e6f482a-2e46-4db9-b716-b9d30c3daf43\" (UID: \"2e6f482a-2e46-4db9-b716-b9d30c3daf43\") " Apr 23 17:02:34.613867 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:34.613831 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e6f482a-2e46-4db9-b716-b9d30c3daf43-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "2e6f482a-2e46-4db9-b716-b9d30c3daf43" (UID: "2e6f482a-2e46-4db9-b716-b9d30c3daf43"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:02:34.613934 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:34.613884 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e6f482a-2e46-4db9-b716-b9d30c3daf43-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "2e6f482a-2e46-4db9-b716-b9d30c3daf43" (UID: "2e6f482a-2e46-4db9-b716-b9d30c3daf43"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:02:34.614125 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:34.614101 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e6f482a-2e46-4db9-b716-b9d30c3daf43-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "2e6f482a-2e46-4db9-b716-b9d30c3daf43" (UID: "2e6f482a-2e46-4db9-b716-b9d30c3daf43"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:02:34.614565 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:34.614535 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e6f482a-2e46-4db9-b716-b9d30c3daf43-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2e6f482a-2e46-4db9-b716-b9d30c3daf43" (UID: "2e6f482a-2e46-4db9-b716-b9d30c3daf43"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:02:34.615764 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:34.615738 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e6f482a-2e46-4db9-b716-b9d30c3daf43-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2e6f482a-2e46-4db9-b716-b9d30c3daf43" (UID: "2e6f482a-2e46-4db9-b716-b9d30c3daf43"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:02:34.615764 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:34.615746 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e6f482a-2e46-4db9-b716-b9d30c3daf43-kube-api-access-6wbvk" (OuterVolumeSpecName: "kube-api-access-6wbvk") pod "2e6f482a-2e46-4db9-b716-b9d30c3daf43" (UID: "2e6f482a-2e46-4db9-b716-b9d30c3daf43"). InnerVolumeSpecName "kube-api-access-6wbvk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:02:34.714691 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:34.714652 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e6f482a-2e46-4db9-b716-b9d30c3daf43-kserve-provision-location\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:02:34.714691 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:34.714685 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2e6f482a-2e46-4db9-b716-b9d30c3daf43-tokenizer-tmp\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:02:34.714691 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:34.714699 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6f482a-2e46-4db9-b716-b9d30c3daf43-tls-certs\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:02:34.714932 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:34.714710 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2e6f482a-2e46-4db9-b716-b9d30c3daf43-tokenizer-cache\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:02:34.714932 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:34.714722 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2e6f482a-2e46-4db9-b716-b9d30c3daf43-tokenizer-uds\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:02:34.714932 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:34.714733 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6wbvk\" (UniqueName: \"kubernetes.io/projected/2e6f482a-2e46-4db9-b716-b9d30c3daf43-kube-api-access-6wbvk\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:02:35.349719 ip-10-0-128-198 
kubenswrapper[2580]: I0423 17:02:35.349685 2580 generic.go:358] "Generic (PLEG): container finished" podID="2e6f482a-2e46-4db9-b716-b9d30c3daf43" containerID="52acdccada47dfe7a4cc37b0dfa6260b338310a8a8564cd6d668b0cf68071b36" exitCode=0
Apr 23 17:02:35.350181 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:35.349761 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2"
Apr 23 17:02:35.350181 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:35.349761 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" event={"ID":"2e6f482a-2e46-4db9-b716-b9d30c3daf43","Type":"ContainerDied","Data":"52acdccada47dfe7a4cc37b0dfa6260b338310a8a8564cd6d668b0cf68071b36"}
Apr 23 17:02:35.350181 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:35.349804 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2" event={"ID":"2e6f482a-2e46-4db9-b716-b9d30c3daf43","Type":"ContainerDied","Data":"f290d4591c0c1332aa460d1609ef4726c8233e127b89f0d15e6e00fffb3746a1"}
Apr 23 17:02:35.350181 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:35.349825 2580 scope.go:117] "RemoveContainer" containerID="52acdccada47dfe7a4cc37b0dfa6260b338310a8a8564cd6d668b0cf68071b36"
Apr 23 17:02:35.358785 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:35.358759 2580 scope.go:117] "RemoveContainer" containerID="9d243f4171553df14f4d83c02f6254dd8db413b795f986de2d10be2e46bd3d7a"
Apr 23 17:02:35.367863 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:35.367842 2580 scope.go:117] "RemoveContainer" containerID="10288fb2d46caa569a41489013fabd8bc8c348bdfca5d5f99e49300607c7b77a"
Apr 23 17:02:35.374061 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:35.374034 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2"]
Apr 23 17:02:35.385231 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:35.385206 2580 scope.go:117] "RemoveContainer" containerID="52acdccada47dfe7a4cc37b0dfa6260b338310a8a8564cd6d668b0cf68071b36"
Apr 23 17:02:35.385654 ip-10-0-128-198 kubenswrapper[2580]: E0423 17:02:35.385623 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52acdccada47dfe7a4cc37b0dfa6260b338310a8a8564cd6d668b0cf68071b36\": container with ID starting with 52acdccada47dfe7a4cc37b0dfa6260b338310a8a8564cd6d668b0cf68071b36 not found: ID does not exist" containerID="52acdccada47dfe7a4cc37b0dfa6260b338310a8a8564cd6d668b0cf68071b36"
Apr 23 17:02:35.385782 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:35.385667 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52acdccada47dfe7a4cc37b0dfa6260b338310a8a8564cd6d668b0cf68071b36"} err="failed to get container status \"52acdccada47dfe7a4cc37b0dfa6260b338310a8a8564cd6d668b0cf68071b36\": rpc error: code = NotFound desc = could not find container \"52acdccada47dfe7a4cc37b0dfa6260b338310a8a8564cd6d668b0cf68071b36\": container with ID starting with 52acdccada47dfe7a4cc37b0dfa6260b338310a8a8564cd6d668b0cf68071b36 not found: ID does not exist"
Apr 23 17:02:35.385782 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:35.385695 2580 scope.go:117] "RemoveContainer" containerID="9d243f4171553df14f4d83c02f6254dd8db413b795f986de2d10be2e46bd3d7a"
Apr 23 17:02:35.386049 ip-10-0-128-198 kubenswrapper[2580]: E0423 17:02:35.386017 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d243f4171553df14f4d83c02f6254dd8db413b795f986de2d10be2e46bd3d7a\": container with ID starting with 9d243f4171553df14f4d83c02f6254dd8db413b795f986de2d10be2e46bd3d7a not found: ID does not exist" containerID="9d243f4171553df14f4d83c02f6254dd8db413b795f986de2d10be2e46bd3d7a"
Apr 23 17:02:35.386127 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:35.386068 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d243f4171553df14f4d83c02f6254dd8db413b795f986de2d10be2e46bd3d7a"} err="failed to get container status \"9d243f4171553df14f4d83c02f6254dd8db413b795f986de2d10be2e46bd3d7a\": rpc error: code = NotFound desc = could not find container \"9d243f4171553df14f4d83c02f6254dd8db413b795f986de2d10be2e46bd3d7a\": container with ID starting with 9d243f4171553df14f4d83c02f6254dd8db413b795f986de2d10be2e46bd3d7a not found: ID does not exist"
Apr 23 17:02:35.386127 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:35.386093 2580 scope.go:117] "RemoveContainer" containerID="10288fb2d46caa569a41489013fabd8bc8c348bdfca5d5f99e49300607c7b77a"
Apr 23 17:02:35.386420 ip-10-0-128-198 kubenswrapper[2580]: E0423 17:02:35.386397 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10288fb2d46caa569a41489013fabd8bc8c348bdfca5d5f99e49300607c7b77a\": container with ID starting with 10288fb2d46caa569a41489013fabd8bc8c348bdfca5d5f99e49300607c7b77a not found: ID does not exist" containerID="10288fb2d46caa569a41489013fabd8bc8c348bdfca5d5f99e49300607c7b77a"
Apr 23 17:02:35.386497 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:35.386430 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10288fb2d46caa569a41489013fabd8bc8c348bdfca5d5f99e49300607c7b77a"} err="failed to get container status \"10288fb2d46caa569a41489013fabd8bc8c348bdfca5d5f99e49300607c7b77a\": rpc error: code = NotFound desc = could not find container \"10288fb2d46caa569a41489013fabd8bc8c348bdfca5d5f99e49300607c7b77a\": container with ID starting with 10288fb2d46caa569a41489013fabd8bc8c348bdfca5d5f99e49300607c7b77a not found: ID does not exist"
Apr 23 17:02:35.386833 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:35.386804 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-64bbf7x8d2"]
Apr 23 17:02:36.537481 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:36.537442 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e6f482a-2e46-4db9-b716-b9d30c3daf43" path="/var/lib/kubelet/pods/2e6f482a-2e46-4db9-b716-b9d30c3daf43/volumes"
Apr 23 17:02:40.775783 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:40.775756 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g"
Apr 23 17:02:40.784850 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:02:40.784821 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g"
Apr 23 17:03:02.105483 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.105448 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g"]
Apr 23 17:03:02.105917 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.105764 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" podUID="06e48821-aabf-4e37-9252-48411a197de1" containerName="main" containerID="cri-o://0ce3062f3ad6d67cea52120789e6bb16010993eff41dad94465e3d85246b4d20" gracePeriod=30
Apr 23 17:03:02.188140 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.188103 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"]
Apr 23 17:03:02.188617 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.188598 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e6f482a-2e46-4db9-b716-b9d30c3daf43" containerName="tokenizer"
Apr 23 17:03:02.188617 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.188619 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e6f482a-2e46-4db9-b716-b9d30c3daf43" containerName="tokenizer"
Apr 23 17:03:02.188829 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.188642 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2bffe611-bf91-4e96-b44e-e2a68e638034" containerName="main"
Apr 23 17:03:02.188829 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.188650 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bffe611-bf91-4e96-b44e-e2a68e638034" containerName="main"
Apr 23 17:03:02.188829 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.188662 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e6f482a-2e46-4db9-b716-b9d30c3daf43" containerName="main"
Apr 23 17:03:02.188829 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.188670 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e6f482a-2e46-4db9-b716-b9d30c3daf43" containerName="main"
Apr 23 17:03:02.188829 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.188694 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2bffe611-bf91-4e96-b44e-e2a68e638034" containerName="storage-initializer"
Apr 23 17:03:02.188829 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.188703 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bffe611-bf91-4e96-b44e-e2a68e638034" containerName="storage-initializer"
Apr 23 17:03:02.188829 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.188713 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e6f482a-2e46-4db9-b716-b9d30c3daf43" containerName="storage-initializer"
Apr 23 17:03:02.188829 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.188721 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e6f482a-2e46-4db9-b716-b9d30c3daf43" containerName="storage-initializer"
Apr 23 17:03:02.188829 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.188850 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="2e6f482a-2e46-4db9-b716-b9d30c3daf43" containerName="main"
Apr 23 17:03:02.189383 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.188866 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="2e6f482a-2e46-4db9-b716-b9d30c3daf43" containerName="tokenizer"
Apr 23 17:03:02.189383 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.188877 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="2bffe611-bf91-4e96-b44e-e2a68e638034" containerName="main"
Apr 23 17:03:02.192677 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.192655 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"
Apr 23 17:03:02.195392 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.195367 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-2-openshift-default-dockercfg-lgltw\""
Apr 23 17:03:02.195482 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.195433 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\""
Apr 23 17:03:02.202088 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.202059 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"]
Apr 23 17:03:02.260344 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.260284 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/b1752178-f74d-452b-ad33-b8e28f685826-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-s5qjx\" (UID: \"b1752178-f74d-452b-ad33-b8e28f685826\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"
Apr 23 17:03:02.260555 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.260372 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/b1752178-f74d-452b-ad33-b8e28f685826-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-s5qjx\" (UID: \"b1752178-f74d-452b-ad33-b8e28f685826\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"
Apr 23 17:03:02.260555 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.260420 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/b1752178-f74d-452b-ad33-b8e28f685826-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-s5qjx\" (UID: \"b1752178-f74d-452b-ad33-b8e28f685826\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"
Apr 23 17:03:02.260555 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.260453 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/b1752178-f74d-452b-ad33-b8e28f685826-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-s5qjx\" (UID: \"b1752178-f74d-452b-ad33-b8e28f685826\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"
Apr 23 17:03:02.260555 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.260482 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/b1752178-f74d-452b-ad33-b8e28f685826-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-s5qjx\" (UID: \"b1752178-f74d-452b-ad33-b8e28f685826\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"
Apr 23 17:03:02.260555 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.260508 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwqtl\" (UniqueName: \"kubernetes.io/projected/b1752178-f74d-452b-ad33-b8e28f685826-kube-api-access-qwqtl\") pod \"router-gateway-2-openshift-default-6866b85949-s5qjx\" (UID: \"b1752178-f74d-452b-ad33-b8e28f685826\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"
Apr 23 17:03:02.260770 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.260569 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/b1752178-f74d-452b-ad33-b8e28f685826-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-s5qjx\" (UID: \"b1752178-f74d-452b-ad33-b8e28f685826\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"
Apr 23 17:03:02.260770 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.260610 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/b1752178-f74d-452b-ad33-b8e28f685826-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-s5qjx\" (UID: \"b1752178-f74d-452b-ad33-b8e28f685826\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"
Apr 23 17:03:02.260770 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.260643 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/b1752178-f74d-452b-ad33-b8e28f685826-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-s5qjx\" (UID: \"b1752178-f74d-452b-ad33-b8e28f685826\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"
Apr 23 17:03:02.361931 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.361838 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/b1752178-f74d-452b-ad33-b8e28f685826-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-s5qjx\" (UID: \"b1752178-f74d-452b-ad33-b8e28f685826\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"
Apr 23 17:03:02.362105 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.361926 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/b1752178-f74d-452b-ad33-b8e28f685826-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-s5qjx\" (UID: \"b1752178-f74d-452b-ad33-b8e28f685826\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"
Apr 23 17:03:02.362105 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.361988 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/b1752178-f74d-452b-ad33-b8e28f685826-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-s5qjx\" (UID: \"b1752178-f74d-452b-ad33-b8e28f685826\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"
Apr 23 17:03:02.362105 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.362022 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/b1752178-f74d-452b-ad33-b8e28f685826-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-s5qjx\" (UID: \"b1752178-f74d-452b-ad33-b8e28f685826\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"
Apr 23 17:03:02.362105 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.362053 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/b1752178-f74d-452b-ad33-b8e28f685826-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-s5qjx\" (UID: \"b1752178-f74d-452b-ad33-b8e28f685826\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"
Apr 23 17:03:02.362105 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.362077 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwqtl\" (UniqueName: \"kubernetes.io/projected/b1752178-f74d-452b-ad33-b8e28f685826-kube-api-access-qwqtl\") pod \"router-gateway-2-openshift-default-6866b85949-s5qjx\" (UID: \"b1752178-f74d-452b-ad33-b8e28f685826\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"
Apr 23 17:03:02.362105 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.362099 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/b1752178-f74d-452b-ad33-b8e28f685826-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-s5qjx\" (UID: \"b1752178-f74d-452b-ad33-b8e28f685826\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"
Apr 23 17:03:02.362478 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.362123 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/b1752178-f74d-452b-ad33-b8e28f685826-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-s5qjx\" (UID: \"b1752178-f74d-452b-ad33-b8e28f685826\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"
Apr 23 17:03:02.362478 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.362153 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/b1752178-f74d-452b-ad33-b8e28f685826-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-s5qjx\" (UID: \"b1752178-f74d-452b-ad33-b8e28f685826\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"
Apr 23 17:03:02.362667 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.362519 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/b1752178-f74d-452b-ad33-b8e28f685826-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-s5qjx\" (UID: \"b1752178-f74d-452b-ad33-b8e28f685826\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"
Apr 23 17:03:02.362667 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.362628 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/b1752178-f74d-452b-ad33-b8e28f685826-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-s5qjx\" (UID: \"b1752178-f74d-452b-ad33-b8e28f685826\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"
Apr 23 17:03:02.362788 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.362769 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/b1752178-f74d-452b-ad33-b8e28f685826-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-s5qjx\" (UID: \"b1752178-f74d-452b-ad33-b8e28f685826\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"
Apr 23 17:03:02.362872 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.362840 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/b1752178-f74d-452b-ad33-b8e28f685826-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-s5qjx\" (UID: \"b1752178-f74d-452b-ad33-b8e28f685826\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"
Apr 23 17:03:02.362995 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.362855 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/b1752178-f74d-452b-ad33-b8e28f685826-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-s5qjx\" (UID: \"b1752178-f74d-452b-ad33-b8e28f685826\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"
Apr 23 17:03:02.364342 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.364321 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/b1752178-f74d-452b-ad33-b8e28f685826-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-s5qjx\" (UID: \"b1752178-f74d-452b-ad33-b8e28f685826\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"
Apr 23 17:03:02.365011 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.364990 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/b1752178-f74d-452b-ad33-b8e28f685826-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-s5qjx\" (UID: \"b1752178-f74d-452b-ad33-b8e28f685826\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"
Apr 23 17:03:02.370948 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.370921 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwqtl\" (UniqueName: \"kubernetes.io/projected/b1752178-f74d-452b-ad33-b8e28f685826-kube-api-access-qwqtl\") pod \"router-gateway-2-openshift-default-6866b85949-s5qjx\" (UID: \"b1752178-f74d-452b-ad33-b8e28f685826\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"
Apr 23 17:03:02.371367 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.371344 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/b1752178-f74d-452b-ad33-b8e28f685826-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-s5qjx\" (UID: \"b1752178-f74d-452b-ad33-b8e28f685826\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"
Apr 23 17:03:02.505150 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.505108 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"
Apr 23 17:03:02.648247 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.648220 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"]
Apr 23 17:03:02.651153 ip-10-0-128-198 kubenswrapper[2580]: W0423 17:03:02.651124 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1752178_f74d_452b_ad33_b8e28f685826.slice/crio-bd25bc726083d93260c9950aca9c03f7829bbf925573379cfb8fbd8bc5312f03 WatchSource:0}: Error finding container bd25bc726083d93260c9950aca9c03f7829bbf925573379cfb8fbd8bc5312f03: Status 404 returned error can't find the container with id bd25bc726083d93260c9950aca9c03f7829bbf925573379cfb8fbd8bc5312f03
Apr 23 17:03:02.653134 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:02.653113 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 17:03:03.469589 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:03.469542 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx" event={"ID":"b1752178-f74d-452b-ad33-b8e28f685826","Type":"ContainerStarted","Data":"bd25bc726083d93260c9950aca9c03f7829bbf925573379cfb8fbd8bc5312f03"}
Apr 23 17:03:04.875225 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:04.875187 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv"]
Apr 23 17:03:04.879496 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:04.879466 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv"
Apr 23 17:03:04.882278 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:04.882254 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-dockercfg-4lqt8\""
Apr 23 17:03:04.882428 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:04.882313 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\""
Apr 23 17:03:04.892494 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:04.892469 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv"]
Apr 23 17:03:04.939582 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:04.939510 2580 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 23 17:03:04.939722 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:04.939629 2580 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 23 17:03:04.939722 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:04.939682 2580 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 23 17:03:04.990171 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:04.990147 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4c156d5-a210-42eb-93ed-57a1a1998a2b-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-5d668c8c46-wlvlv\" (UID: \"b4c156d5-a210-42eb-93ed-57a1a1998a2b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv"
Apr 23 17:03:04.990324 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:04.990213 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b4c156d5-a210-42eb-93ed-57a1a1998a2b-tls-certs\") pod \"router-with-refs-pd-test-kserve-5d668c8c46-wlvlv\" (UID: \"b4c156d5-a210-42eb-93ed-57a1a1998a2b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv"
Apr 23 17:03:04.990324 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:04.990246 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrkqx\" (UniqueName: \"kubernetes.io/projected/b4c156d5-a210-42eb-93ed-57a1a1998a2b-kube-api-access-rrkqx\") pod \"router-with-refs-pd-test-kserve-5d668c8c46-wlvlv\" (UID: \"b4c156d5-a210-42eb-93ed-57a1a1998a2b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv"
Apr 23 17:03:04.990324 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:04.990272 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b4c156d5-a210-42eb-93ed-57a1a1998a2b-dshm\") pod \"router-with-refs-pd-test-kserve-5d668c8c46-wlvlv\" (UID: \"b4c156d5-a210-42eb-93ed-57a1a1998a2b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv"
Apr 23 17:03:04.990324 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:04.990314 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b4c156d5-a210-42eb-93ed-57a1a1998a2b-home\") pod \"router-with-refs-pd-test-kserve-5d668c8c46-wlvlv\" (UID: \"b4c156d5-a210-42eb-93ed-57a1a1998a2b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv"
Apr 23 17:03:04.990495 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:04.990361 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b4c156d5-a210-42eb-93ed-57a1a1998a2b-model-cache\") pod \"router-with-refs-pd-test-kserve-5d668c8c46-wlvlv\" (UID: \"b4c156d5-a210-42eb-93ed-57a1a1998a2b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv"
Apr 23 17:03:05.090936 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.090889 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4c156d5-a210-42eb-93ed-57a1a1998a2b-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-5d668c8c46-wlvlv\" (UID: \"b4c156d5-a210-42eb-93ed-57a1a1998a2b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv"
Apr 23 17:03:05.091233 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.090950 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b4c156d5-a210-42eb-93ed-57a1a1998a2b-tls-certs\") pod \"router-with-refs-pd-test-kserve-5d668c8c46-wlvlv\" (UID: \"b4c156d5-a210-42eb-93ed-57a1a1998a2b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv"
Apr 23 17:03:05.091233 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.090979 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rrkqx\" (UniqueName: \"kubernetes.io/projected/b4c156d5-a210-42eb-93ed-57a1a1998a2b-kube-api-access-rrkqx\") pod \"router-with-refs-pd-test-kserve-5d668c8c46-wlvlv\" (UID: \"b4c156d5-a210-42eb-93ed-57a1a1998a2b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv"
Apr 23 17:03:05.091233 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.091008 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b4c156d5-a210-42eb-93ed-57a1a1998a2b-dshm\") pod \"router-with-refs-pd-test-kserve-5d668c8c46-wlvlv\" (UID: \"b4c156d5-a210-42eb-93ed-57a1a1998a2b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv"
Apr 23 17:03:05.091233 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.091034 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b4c156d5-a210-42eb-93ed-57a1a1998a2b-home\") pod \"router-with-refs-pd-test-kserve-5d668c8c46-wlvlv\" (UID: \"b4c156d5-a210-42eb-93ed-57a1a1998a2b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv"
Apr 23 17:03:05.091233 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.091061 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b4c156d5-a210-42eb-93ed-57a1a1998a2b-model-cache\") pod \"router-with-refs-pd-test-kserve-5d668c8c46-wlvlv\" (UID: \"b4c156d5-a210-42eb-93ed-57a1a1998a2b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv"
Apr 23 17:03:05.091551 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.091271 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4c156d5-a210-42eb-93ed-57a1a1998a2b-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-5d668c8c46-wlvlv\" (UID: \"b4c156d5-a210-42eb-93ed-57a1a1998a2b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv"
Apr 23 17:03:05.091551 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.091404 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b4c156d5-a210-42eb-93ed-57a1a1998a2b-model-cache\") pod \"router-with-refs-pd-test-kserve-5d668c8c46-wlvlv\" (UID: \"b4c156d5-a210-42eb-93ed-57a1a1998a2b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv"
Apr 23 17:03:05.091779 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.091754 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b4c156d5-a210-42eb-93ed-57a1a1998a2b-home\") pod \"router-with-refs-pd-test-kserve-5d668c8c46-wlvlv\" (UID: \"b4c156d5-a210-42eb-93ed-57a1a1998a2b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv"
Apr 23 17:03:05.093717 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.093685 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b4c156d5-a210-42eb-93ed-57a1a1998a2b-tls-certs\") pod \"router-with-refs-pd-test-kserve-5d668c8c46-wlvlv\" (UID: \"b4c156d5-a210-42eb-93ed-57a1a1998a2b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv"
Apr 23 17:03:05.093857 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.093739 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b4c156d5-a210-42eb-93ed-57a1a1998a2b-dshm\") pod \"router-with-refs-pd-test-kserve-5d668c8c46-wlvlv\" (UID: \"b4c156d5-a210-42eb-93ed-57a1a1998a2b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv"
Apr 23 17:03:05.100654 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.100608 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrkqx\" (UniqueName: \"kubernetes.io/projected/b4c156d5-a210-42eb-93ed-57a1a1998a2b-kube-api-access-rrkqx\") pod \"router-with-refs-pd-test-kserve-5d668c8c46-wlvlv\" (UID: \"b4c156d5-a210-42eb-93ed-57a1a1998a2b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv"
Apr 23 17:03:05.189699 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.189608 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv"
Apr 23 17:03:05.324134 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.324103 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv"]
Apr 23 17:03:05.325526 ip-10-0-128-198 kubenswrapper[2580]: W0423 17:03:05.325491 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4c156d5_a210_42eb_93ed_57a1a1998a2b.slice/crio-c151214cb5a000fcfad20c50c7fe733e7686c150a7120c4e7b3d16c5c02778e6 WatchSource:0}: Error finding container c151214cb5a000fcfad20c50c7fe733e7686c150a7120c4e7b3d16c5c02778e6: Status 404 returned error can't find the container with id c151214cb5a000fcfad20c50c7fe733e7686c150a7120c4e7b3d16c5c02778e6
Apr 23 17:03:05.479810 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.479772 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv" event={"ID":"b4c156d5-a210-42eb-93ed-57a1a1998a2b","Type":"ContainerStarted","Data":"14e0c7d6dd852966c5eece597e90a5269eaccb1bef0d8d8627a2af825d5b7cf9"}
Apr 23 17:03:05.479810 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.479814 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv" event={"ID":"b4c156d5-a210-42eb-93ed-57a1a1998a2b","Type":"ContainerStarted","Data":"c151214cb5a000fcfad20c50c7fe733e7686c150a7120c4e7b3d16c5c02778e6"}
Apr 23 17:03:05.480082 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.479851 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv"
Apr 23 17:03:05.481162 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.481132 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx" event={"ID":"b1752178-f74d-452b-ad33-b8e28f685826","Type":"ContainerStarted","Data":"08f60ac90e5d7251d0c5719a70608a54bd7d71c3d31fd0c04e774d03e0da1fd6"}
Apr 23 17:03:05.505959 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.505939 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx"
Apr 23 17:03:05.518535 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.518498 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx" podStartSLOduration=1.232691382 podStartE2EDuration="3.518485817s" podCreationTimestamp="2026-04-23 17:03:02 +0000 UTC" firstStartedPulling="2026-04-23 17:03:02.65332167 +0000 UTC m=+1668.714395902" lastFinishedPulling="2026-04-23 17:03:04.939116092 +0000 UTC m=+1671.000190337" observedRunningTime="2026-04-23 17:03:05.517256188 +0000 UTC m=+1671.578330448" watchObservedRunningTime="2026-04-23 17:03:05.518485817 +0000 UTC m=+1671.579560071"
Apr 23 17:03:05.569778 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.569755 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q"]
Apr 23 17:03:05.573629 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.573611 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q"
Apr 23 17:03:05.576287 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.576268 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-epp-sa-dockercfg-gwpcg\""
Apr 23 17:03:05.582431 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.582406 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q"]
Apr 23 17:03:05.699316 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.697692 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3197270b-bdb5-4935-a528-aa7103cfa4ad-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q\" (UID: \"3197270b-bdb5-4935-a528-aa7103cfa4ad\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q"
Apr 23 17:03:05.699316 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.697747 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3197270b-bdb5-4935-a528-aa7103cfa4ad-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q\" (UID: \"3197270b-bdb5-4935-a528-aa7103cfa4ad\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q"
Apr 23 17:03:05.699316 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.697825 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trq59\" (UniqueName: \"kubernetes.io/projected/3197270b-bdb5-4935-a528-aa7103cfa4ad-kube-api-access-trq59\") pod \"router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q\" (UID:
\"3197270b-bdb5-4935-a528-aa7103cfa4ad\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" Apr 23 17:03:05.699316 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.697998 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3197270b-bdb5-4935-a528-aa7103cfa4ad-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q\" (UID: \"3197270b-bdb5-4935-a528-aa7103cfa4ad\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" Apr 23 17:03:05.699316 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.698037 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3197270b-bdb5-4935-a528-aa7103cfa4ad-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q\" (UID: \"3197270b-bdb5-4935-a528-aa7103cfa4ad\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" Apr 23 17:03:05.699316 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.698083 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3197270b-bdb5-4935-a528-aa7103cfa4ad-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q\" (UID: \"3197270b-bdb5-4935-a528-aa7103cfa4ad\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" Apr 23 17:03:05.799067 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.798961 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trq59\" (UniqueName: \"kubernetes.io/projected/3197270b-bdb5-4935-a528-aa7103cfa4ad-kube-api-access-trq59\") pod 
\"router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q\" (UID: \"3197270b-bdb5-4935-a528-aa7103cfa4ad\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" Apr 23 17:03:05.799255 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.799097 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3197270b-bdb5-4935-a528-aa7103cfa4ad-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q\" (UID: \"3197270b-bdb5-4935-a528-aa7103cfa4ad\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" Apr 23 17:03:05.799255 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.799160 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3197270b-bdb5-4935-a528-aa7103cfa4ad-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q\" (UID: \"3197270b-bdb5-4935-a528-aa7103cfa4ad\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" Apr 23 17:03:05.799255 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.799191 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3197270b-bdb5-4935-a528-aa7103cfa4ad-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q\" (UID: \"3197270b-bdb5-4935-a528-aa7103cfa4ad\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" Apr 23 17:03:05.799255 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.799232 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3197270b-bdb5-4935-a528-aa7103cfa4ad-kserve-provision-location\") pod 
\"router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q\" (UID: \"3197270b-bdb5-4935-a528-aa7103cfa4ad\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" Apr 23 17:03:05.799500 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.799261 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3197270b-bdb5-4935-a528-aa7103cfa4ad-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q\" (UID: \"3197270b-bdb5-4935-a528-aa7103cfa4ad\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" Apr 23 17:03:05.799609 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.799582 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3197270b-bdb5-4935-a528-aa7103cfa4ad-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q\" (UID: \"3197270b-bdb5-4935-a528-aa7103cfa4ad\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" Apr 23 17:03:05.799679 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.799620 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3197270b-bdb5-4935-a528-aa7103cfa4ad-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q\" (UID: \"3197270b-bdb5-4935-a528-aa7103cfa4ad\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" Apr 23 17:03:05.799900 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.799869 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3197270b-bdb5-4935-a528-aa7103cfa4ad-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q\" 
(UID: \"3197270b-bdb5-4935-a528-aa7103cfa4ad\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" Apr 23 17:03:05.799986 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.799924 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3197270b-bdb5-4935-a528-aa7103cfa4ad-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q\" (UID: \"3197270b-bdb5-4935-a528-aa7103cfa4ad\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" Apr 23 17:03:05.802954 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.802923 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3197270b-bdb5-4935-a528-aa7103cfa4ad-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q\" (UID: \"3197270b-bdb5-4935-a528-aa7103cfa4ad\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" Apr 23 17:03:05.809156 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.809121 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trq59\" (UniqueName: \"kubernetes.io/projected/3197270b-bdb5-4935-a528-aa7103cfa4ad-kube-api-access-trq59\") pod \"router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q\" (UID: \"3197270b-bdb5-4935-a528-aa7103cfa4ad\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" Apr 23 17:03:05.884868 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:05.884831 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" Apr 23 17:03:06.259578 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:06.259551 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q"] Apr 23 17:03:06.261441 ip-10-0-128-198 kubenswrapper[2580]: W0423 17:03:06.261411 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3197270b_bdb5_4935_a528_aa7103cfa4ad.slice/crio-fa46423fe54a6e1ac8354c6055a14ba7bfe7b65ef86c1b7a8df9e2bbec1647aa WatchSource:0}: Error finding container fa46423fe54a6e1ac8354c6055a14ba7bfe7b65ef86c1b7a8df9e2bbec1647aa: Status 404 returned error can't find the container with id fa46423fe54a6e1ac8354c6055a14ba7bfe7b65ef86c1b7a8df9e2bbec1647aa Apr 23 17:03:06.488807 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:06.488755 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv" event={"ID":"b4c156d5-a210-42eb-93ed-57a1a1998a2b","Type":"ContainerStarted","Data":"0697f22c6a846042923d901728f26d8eac550c9b754f8f497a772eb70a9d4256"} Apr 23 17:03:06.491143 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:06.491104 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" event={"ID":"3197270b-bdb5-4935-a528-aa7103cfa4ad","Type":"ContainerStarted","Data":"5c689740e4333d0e190cdb06db4368180832524e4ebc9f8ad02285e122ff92ab"} Apr 23 17:03:06.491322 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:06.491148 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" 
event={"ID":"3197270b-bdb5-4935-a528-aa7103cfa4ad","Type":"ContainerStarted","Data":"fa46423fe54a6e1ac8354c6055a14ba7bfe7b65ef86c1b7a8df9e2bbec1647aa"} Apr 23 17:03:06.506935 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:06.506885 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx" podUID="b1752178-f74d-452b-ad33-b8e28f685826" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.132.0.69:15021/healthz/ready\": dial tcp 10.132.0.69:15021: connect: connection refused" Apr 23 17:03:07.499323 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:07.499105 2580 generic.go:358] "Generic (PLEG): container finished" podID="3197270b-bdb5-4935-a528-aa7103cfa4ad" containerID="5c689740e4333d0e190cdb06db4368180832524e4ebc9f8ad02285e122ff92ab" exitCode=0 Apr 23 17:03:07.501894 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:07.501092 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" event={"ID":"3197270b-bdb5-4935-a528-aa7103cfa4ad","Type":"ContainerDied","Data":"5c689740e4333d0e190cdb06db4368180832524e4ebc9f8ad02285e122ff92ab"} Apr 23 17:03:07.510166 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:07.510136 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx" podUID="b1752178-f74d-452b-ad33-b8e28f685826" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.132.0.69:15021/healthz/ready\": dial tcp 10.132.0.69:15021: connect: connection refused" Apr 23 17:03:08.505673 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:08.505635 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" 
event={"ID":"3197270b-bdb5-4935-a528-aa7103cfa4ad","Type":"ContainerStarted","Data":"9dc8a3134c6b79b21fa1911210766ae5e187a4bedd77a1dc88f4df9c17ccc7eb"} Apr 23 17:03:08.505673 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:08.505674 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" event={"ID":"3197270b-bdb5-4935-a528-aa7103cfa4ad","Type":"ContainerStarted","Data":"23a744ec405d741b058573fb34d51878aa8f5cb9ab5800ac8344210480092920"} Apr 23 17:03:08.506145 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:08.505814 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" Apr 23 17:03:08.509794 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:08.509770 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx" Apr 23 17:03:08.510046 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:08.510025 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx" Apr 23 17:03:08.510707 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:08.510686 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-s5qjx" Apr 23 17:03:08.530544 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:08.530484 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" podStartSLOduration=3.530464525 podStartE2EDuration="3.530464525s" podCreationTimestamp="2026-04-23 17:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 
17:03:08.525168748 +0000 UTC m=+1674.586243006" watchObservedRunningTime="2026-04-23 17:03:08.530464525 +0000 UTC m=+1674.591538780" Apr 23 17:03:10.515759 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:10.515728 2580 generic.go:358] "Generic (PLEG): container finished" podID="b4c156d5-a210-42eb-93ed-57a1a1998a2b" containerID="0697f22c6a846042923d901728f26d8eac550c9b754f8f497a772eb70a9d4256" exitCode=0 Apr 23 17:03:10.516153 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:10.515797 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv" event={"ID":"b4c156d5-a210-42eb-93ed-57a1a1998a2b","Type":"ContainerDied","Data":"0697f22c6a846042923d901728f26d8eac550c9b754f8f497a772eb70a9d4256"} Apr 23 17:03:11.522420 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:11.522378 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv" event={"ID":"b4c156d5-a210-42eb-93ed-57a1a1998a2b","Type":"ContainerStarted","Data":"595a636bd45d9554c6a15e46ef4d3c1be70153df03c6e38a196092a534f0c426"} Apr 23 17:03:11.546115 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:11.546052 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv" podStartSLOduration=7.546032256 podStartE2EDuration="7.546032256s" podCreationTimestamp="2026-04-23 17:03:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:03:11.543128598 +0000 UTC m=+1677.604202854" watchObservedRunningTime="2026-04-23 17:03:11.546032256 +0000 UTC m=+1677.607106511" Apr 23 17:03:15.190667 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:15.190616 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv" Apr 23 
17:03:15.191154 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:15.190786 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv" Apr 23 17:03:15.192199 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:15.192166 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv" podUID="b4c156d5-a210-42eb-93ed-57a1a1998a2b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.70:8001/health\": dial tcp 10.132.0.70:8001: connect: connection refused" Apr 23 17:03:15.204407 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:15.204387 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv" Apr 23 17:03:15.885613 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:15.885578 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" Apr 23 17:03:15.885613 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:15.885621 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" Apr 23 17:03:15.888525 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:15.888500 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" Apr 23 17:03:16.548785 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:16.548757 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" Apr 23 17:03:25.190888 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:25.190834 2580 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv" podUID="b4c156d5-a210-42eb-93ed-57a1a1998a2b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.70:8001/health\": dial tcp 10.132.0.70:8001: connect: connection refused" Apr 23 17:03:32.359229 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:32.359203 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" Apr 23 17:03:32.457980 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:32.457881 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/06e48821-aabf-4e37-9252-48411a197de1-kserve-provision-location\") pod \"06e48821-aabf-4e37-9252-48411a197de1\" (UID: \"06e48821-aabf-4e37-9252-48411a197de1\") " Apr 23 17:03:32.457980 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:32.457932 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9mxn\" (UniqueName: \"kubernetes.io/projected/06e48821-aabf-4e37-9252-48411a197de1-kube-api-access-p9mxn\") pod \"06e48821-aabf-4e37-9252-48411a197de1\" (UID: \"06e48821-aabf-4e37-9252-48411a197de1\") " Apr 23 17:03:32.457980 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:32.457972 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/06e48821-aabf-4e37-9252-48411a197de1-dshm\") pod \"06e48821-aabf-4e37-9252-48411a197de1\" (UID: \"06e48821-aabf-4e37-9252-48411a197de1\") " Apr 23 17:03:32.458281 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:32.457998 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/06e48821-aabf-4e37-9252-48411a197de1-tls-certs\") pod \"06e48821-aabf-4e37-9252-48411a197de1\" (UID: 
\"06e48821-aabf-4e37-9252-48411a197de1\") " Apr 23 17:03:32.458281 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:32.458056 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/06e48821-aabf-4e37-9252-48411a197de1-home\") pod \"06e48821-aabf-4e37-9252-48411a197de1\" (UID: \"06e48821-aabf-4e37-9252-48411a197de1\") " Apr 23 17:03:32.458281 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:32.458082 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/06e48821-aabf-4e37-9252-48411a197de1-model-cache\") pod \"06e48821-aabf-4e37-9252-48411a197de1\" (UID: \"06e48821-aabf-4e37-9252-48411a197de1\") " Apr 23 17:03:32.458644 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:32.458615 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06e48821-aabf-4e37-9252-48411a197de1-model-cache" (OuterVolumeSpecName: "model-cache") pod "06e48821-aabf-4e37-9252-48411a197de1" (UID: "06e48821-aabf-4e37-9252-48411a197de1"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:03:32.458856 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:32.458822 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06e48821-aabf-4e37-9252-48411a197de1-home" (OuterVolumeSpecName: "home") pod "06e48821-aabf-4e37-9252-48411a197de1" (UID: "06e48821-aabf-4e37-9252-48411a197de1"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:03:32.460274 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:32.460244 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06e48821-aabf-4e37-9252-48411a197de1-kube-api-access-p9mxn" (OuterVolumeSpecName: "kube-api-access-p9mxn") pod "06e48821-aabf-4e37-9252-48411a197de1" (UID: "06e48821-aabf-4e37-9252-48411a197de1"). InnerVolumeSpecName "kube-api-access-p9mxn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:03:32.460657 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:32.460636 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06e48821-aabf-4e37-9252-48411a197de1-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "06e48821-aabf-4e37-9252-48411a197de1" (UID: "06e48821-aabf-4e37-9252-48411a197de1"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:03:32.460657 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:32.460646 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06e48821-aabf-4e37-9252-48411a197de1-dshm" (OuterVolumeSpecName: "dshm") pod "06e48821-aabf-4e37-9252-48411a197de1" (UID: "06e48821-aabf-4e37-9252-48411a197de1"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:03:32.516816 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:32.516770 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06e48821-aabf-4e37-9252-48411a197de1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "06e48821-aabf-4e37-9252-48411a197de1" (UID: "06e48821-aabf-4e37-9252-48411a197de1"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:03:32.559052 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:32.559022 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/06e48821-aabf-4e37-9252-48411a197de1-model-cache\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:03:32.559052 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:32.559048 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/06e48821-aabf-4e37-9252-48411a197de1-kserve-provision-location\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:03:32.559052 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:32.559059 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p9mxn\" (UniqueName: \"kubernetes.io/projected/06e48821-aabf-4e37-9252-48411a197de1-kube-api-access-p9mxn\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:03:32.559364 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:32.559068 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/06e48821-aabf-4e37-9252-48411a197de1-dshm\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:03:32.559364 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:32.559077 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/06e48821-aabf-4e37-9252-48411a197de1-tls-certs\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:03:32.559364 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:32.559085 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/06e48821-aabf-4e37-9252-48411a197de1-home\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:03:32.617387 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:32.617349 2580 
generic.go:358] "Generic (PLEG): container finished" podID="06e48821-aabf-4e37-9252-48411a197de1" containerID="0ce3062f3ad6d67cea52120789e6bb16010993eff41dad94465e3d85246b4d20" exitCode=137 Apr 23 17:03:32.617580 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:32.617428 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" Apr 23 17:03:32.617580 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:32.617430 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" event={"ID":"06e48821-aabf-4e37-9252-48411a197de1","Type":"ContainerDied","Data":"0ce3062f3ad6d67cea52120789e6bb16010993eff41dad94465e3d85246b4d20"} Apr 23 17:03:32.617580 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:32.617475 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g" event={"ID":"06e48821-aabf-4e37-9252-48411a197de1","Type":"ContainerDied","Data":"af12d036475833bad9035e19d46388b0c5df681ebdb9e123990d79475e2051f2"} Apr 23 17:03:32.617580 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:32.617495 2580 scope.go:117] "RemoveContainer" containerID="0ce3062f3ad6d67cea52120789e6bb16010993eff41dad94465e3d85246b4d20" Apr 23 17:03:32.637451 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:32.637417 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g"] Apr 23 17:03:32.639240 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:32.639110 2580 scope.go:117] "RemoveContainer" containerID="cb462b5029554ba2acb93066c302d79eff20f942d0c7e4c2120a59568ac70830" Apr 23 17:03:32.642628 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:32.642604 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-676f75cdb7-gwk4g"] Apr 23 17:03:32.698220 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:32.698194 2580 scope.go:117] "RemoveContainer" containerID="0ce3062f3ad6d67cea52120789e6bb16010993eff41dad94465e3d85246b4d20" Apr 23 17:03:32.698583 ip-10-0-128-198 kubenswrapper[2580]: E0423 17:03:32.698560 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ce3062f3ad6d67cea52120789e6bb16010993eff41dad94465e3d85246b4d20\": container with ID starting with 0ce3062f3ad6d67cea52120789e6bb16010993eff41dad94465e3d85246b4d20 not found: ID does not exist" containerID="0ce3062f3ad6d67cea52120789e6bb16010993eff41dad94465e3d85246b4d20" Apr 23 17:03:32.698678 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:32.698599 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ce3062f3ad6d67cea52120789e6bb16010993eff41dad94465e3d85246b4d20"} err="failed to get container status \"0ce3062f3ad6d67cea52120789e6bb16010993eff41dad94465e3d85246b4d20\": rpc error: code = NotFound desc = could not find container \"0ce3062f3ad6d67cea52120789e6bb16010993eff41dad94465e3d85246b4d20\": container with ID starting with 0ce3062f3ad6d67cea52120789e6bb16010993eff41dad94465e3d85246b4d20 not found: ID does not exist" Apr 23 17:03:32.698678 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:32.698625 2580 scope.go:117] "RemoveContainer" containerID="cb462b5029554ba2acb93066c302d79eff20f942d0c7e4c2120a59568ac70830" Apr 23 17:03:32.698947 ip-10-0-128-198 kubenswrapper[2580]: E0423 17:03:32.698926 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb462b5029554ba2acb93066c302d79eff20f942d0c7e4c2120a59568ac70830\": container with ID starting with cb462b5029554ba2acb93066c302d79eff20f942d0c7e4c2120a59568ac70830 not found: ID does not exist" 
containerID="cb462b5029554ba2acb93066c302d79eff20f942d0c7e4c2120a59568ac70830" Apr 23 17:03:32.698997 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:32.698953 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb462b5029554ba2acb93066c302d79eff20f942d0c7e4c2120a59568ac70830"} err="failed to get container status \"cb462b5029554ba2acb93066c302d79eff20f942d0c7e4c2120a59568ac70830\": rpc error: code = NotFound desc = could not find container \"cb462b5029554ba2acb93066c302d79eff20f942d0c7e4c2120a59568ac70830\": container with ID starting with cb462b5029554ba2acb93066c302d79eff20f942d0c7e4c2120a59568ac70830 not found: ID does not exist" Apr 23 17:03:34.538386 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:34.538354 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06e48821-aabf-4e37-9252-48411a197de1" path="/var/lib/kubelet/pods/06e48821-aabf-4e37-9252-48411a197de1/volumes" Apr 23 17:03:35.190212 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:35.190165 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv" podUID="b4c156d5-a210-42eb-93ed-57a1a1998a2b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.70:8001/health\": dial tcp 10.132.0.70:8001: connect: connection refused" Apr 23 17:03:37.553562 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:37.553530 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" Apr 23 17:03:45.190119 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:45.190068 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv" podUID="b4c156d5-a210-42eb-93ed-57a1a1998a2b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.70:8001/health\": dial tcp 10.132.0.70:8001: connect: 
connection refused" Apr 23 17:03:55.191025 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:03:55.190958 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv" podUID="b4c156d5-a210-42eb-93ed-57a1a1998a2b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.70:8001/health\": dial tcp 10.132.0.70:8001: connect: connection refused" Apr 23 17:04:05.190402 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:05.190344 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv" podUID="b4c156d5-a210-42eb-93ed-57a1a1998a2b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.70:8001/health\": dial tcp 10.132.0.70:8001: connect: connection refused" Apr 23 17:04:15.190876 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:15.190830 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv" podUID="b4c156d5-a210-42eb-93ed-57a1a1998a2b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.70:8001/health\": dial tcp 10.132.0.70:8001: connect: connection refused" Apr 23 17:04:25.190703 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:25.190645 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv" podUID="b4c156d5-a210-42eb-93ed-57a1a1998a2b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.70:8001/health\": dial tcp 10.132.0.70:8001: connect: connection refused" Apr 23 17:04:35.190206 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:35.190156 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv" podUID="b4c156d5-a210-42eb-93ed-57a1a1998a2b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.70:8001/health\": dial tcp 
10.132.0.70:8001: connect: connection refused" Apr 23 17:04:45.205688 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:45.205651 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv" Apr 23 17:04:45.217517 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:45.217491 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv" Apr 23 17:04:56.173264 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:56.173228 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q"] Apr 23 17:04:56.173661 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:56.173589 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" podUID="3197270b-bdb5-4935-a528-aa7103cfa4ad" containerName="main" containerID="cri-o://23a744ec405d741b058573fb34d51878aa8f5cb9ab5800ac8344210480092920" gracePeriod=30 Apr 23 17:04:56.173733 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:56.173662 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" podUID="3197270b-bdb5-4935-a528-aa7103cfa4ad" containerName="tokenizer" containerID="cri-o://9dc8a3134c6b79b21fa1911210766ae5e187a4bedd77a1dc88f4df9c17ccc7eb" gracePeriod=30 Apr 23 17:04:56.178645 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:56.178557 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv"] Apr 23 17:04:56.179010 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:56.178976 2580 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv" podUID="b4c156d5-a210-42eb-93ed-57a1a1998a2b" containerName="main" containerID="cri-o://595a636bd45d9554c6a15e46ef4d3c1be70153df03c6e38a196092a534f0c426" gracePeriod=30 Apr 23 17:04:56.548439 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:56.548399 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" podUID="3197270b-bdb5-4935-a528-aa7103cfa4ad" containerName="tokenizer" probeResult="failure" output="Get \"http://10.132.0.71:8082/healthz\": dial tcp 10.132.0.71:8082: connect: connection refused" Apr 23 17:04:56.964149 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:56.964064 2580 generic.go:358] "Generic (PLEG): container finished" podID="3197270b-bdb5-4935-a528-aa7103cfa4ad" containerID="23a744ec405d741b058573fb34d51878aa8f5cb9ab5800ac8344210480092920" exitCode=0 Apr 23 17:04:56.964312 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:56.964139 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" event={"ID":"3197270b-bdb5-4935-a528-aa7103cfa4ad","Type":"ContainerDied","Data":"23a744ec405d741b058573fb34d51878aa8f5cb9ab5800ac8344210480092920"} Apr 23 17:04:57.431811 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.431788 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" Apr 23 17:04:57.515583 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.515505 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3197270b-bdb5-4935-a528-aa7103cfa4ad-tokenizer-cache\") pod \"3197270b-bdb5-4935-a528-aa7103cfa4ad\" (UID: \"3197270b-bdb5-4935-a528-aa7103cfa4ad\") " Apr 23 17:04:57.515583 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.515551 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3197270b-bdb5-4935-a528-aa7103cfa4ad-tokenizer-uds\") pod \"3197270b-bdb5-4935-a528-aa7103cfa4ad\" (UID: \"3197270b-bdb5-4935-a528-aa7103cfa4ad\") " Apr 23 17:04:57.515583 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.515578 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3197270b-bdb5-4935-a528-aa7103cfa4ad-tls-certs\") pod \"3197270b-bdb5-4935-a528-aa7103cfa4ad\" (UID: \"3197270b-bdb5-4935-a528-aa7103cfa4ad\") " Apr 23 17:04:57.516319 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.516045 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3197270b-bdb5-4935-a528-aa7103cfa4ad-tokenizer-tmp\") pod \"3197270b-bdb5-4935-a528-aa7103cfa4ad\" (UID: \"3197270b-bdb5-4935-a528-aa7103cfa4ad\") " Apr 23 17:04:57.516319 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.516132 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3197270b-bdb5-4935-a528-aa7103cfa4ad-kserve-provision-location\") pod \"3197270b-bdb5-4935-a528-aa7103cfa4ad\" (UID: \"3197270b-bdb5-4935-a528-aa7103cfa4ad\") " Apr 23 
17:04:57.516319 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.516134 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3197270b-bdb5-4935-a528-aa7103cfa4ad-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "3197270b-bdb5-4935-a528-aa7103cfa4ad" (UID: "3197270b-bdb5-4935-a528-aa7103cfa4ad"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:04:57.516319 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.516193 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trq59\" (UniqueName: \"kubernetes.io/projected/3197270b-bdb5-4935-a528-aa7103cfa4ad-kube-api-access-trq59\") pod \"3197270b-bdb5-4935-a528-aa7103cfa4ad\" (UID: \"3197270b-bdb5-4935-a528-aa7103cfa4ad\") " Apr 23 17:04:57.516319 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.516195 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3197270b-bdb5-4935-a528-aa7103cfa4ad-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "3197270b-bdb5-4935-a528-aa7103cfa4ad" (UID: "3197270b-bdb5-4935-a528-aa7103cfa4ad"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:04:57.516770 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.516734 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3197270b-bdb5-4935-a528-aa7103cfa4ad-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "3197270b-bdb5-4935-a528-aa7103cfa4ad" (UID: "3197270b-bdb5-4935-a528-aa7103cfa4ad"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:04:57.518162 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.517142 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3197270b-bdb5-4935-a528-aa7103cfa4ad-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3197270b-bdb5-4935-a528-aa7103cfa4ad" (UID: "3197270b-bdb5-4935-a528-aa7103cfa4ad"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:04:57.518162 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.517265 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3197270b-bdb5-4935-a528-aa7103cfa4ad-tokenizer-cache\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:04:57.518162 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.517313 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3197270b-bdb5-4935-a528-aa7103cfa4ad-tokenizer-uds\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:04:57.518162 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.517338 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3197270b-bdb5-4935-a528-aa7103cfa4ad-tokenizer-tmp\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:04:57.523869 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.518860 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3197270b-bdb5-4935-a528-aa7103cfa4ad-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "3197270b-bdb5-4935-a528-aa7103cfa4ad" (UID: "3197270b-bdb5-4935-a528-aa7103cfa4ad"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:04:57.523869 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.519012 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3197270b-bdb5-4935-a528-aa7103cfa4ad-kube-api-access-trq59" (OuterVolumeSpecName: "kube-api-access-trq59") pod "3197270b-bdb5-4935-a528-aa7103cfa4ad" (UID: "3197270b-bdb5-4935-a528-aa7103cfa4ad"). InnerVolumeSpecName "kube-api-access-trq59". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:04:57.618313 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.618268 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3197270b-bdb5-4935-a528-aa7103cfa4ad-tls-certs\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:04:57.618313 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.618313 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3197270b-bdb5-4935-a528-aa7103cfa4ad-kserve-provision-location\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:04:57.618455 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.618324 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-trq59\" (UniqueName: \"kubernetes.io/projected/3197270b-bdb5-4935-a528-aa7103cfa4ad-kube-api-access-trq59\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\"" Apr 23 17:04:57.970066 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.970027 2580 generic.go:358] "Generic (PLEG): container finished" podID="3197270b-bdb5-4935-a528-aa7103cfa4ad" containerID="9dc8a3134c6b79b21fa1911210766ae5e187a4bedd77a1dc88f4df9c17ccc7eb" exitCode=0 Apr 23 17:04:57.970260 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.970104 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" 
event={"ID":"3197270b-bdb5-4935-a528-aa7103cfa4ad","Type":"ContainerDied","Data":"9dc8a3134c6b79b21fa1911210766ae5e187a4bedd77a1dc88f4df9c17ccc7eb"} Apr 23 17:04:57.970260 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.970146 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" event={"ID":"3197270b-bdb5-4935-a528-aa7103cfa4ad","Type":"ContainerDied","Data":"fa46423fe54a6e1ac8354c6055a14ba7bfe7b65ef86c1b7a8df9e2bbec1647aa"} Apr 23 17:04:57.970260 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.970151 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q" Apr 23 17:04:57.970260 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.970162 2580 scope.go:117] "RemoveContainer" containerID="9dc8a3134c6b79b21fa1911210766ae5e187a4bedd77a1dc88f4df9c17ccc7eb" Apr 23 17:04:57.979991 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.979972 2580 scope.go:117] "RemoveContainer" containerID="23a744ec405d741b058573fb34d51878aa8f5cb9ab5800ac8344210480092920" Apr 23 17:04:57.989857 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.989839 2580 scope.go:117] "RemoveContainer" containerID="5c689740e4333d0e190cdb06db4368180832524e4ebc9f8ad02285e122ff92ab" Apr 23 17:04:57.994005 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.993984 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q"] Apr 23 17:04:57.997726 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.997705 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-59578dc768zq9q"] Apr 23 17:04:57.999193 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.999179 2580 scope.go:117] "RemoveContainer" 
containerID="9dc8a3134c6b79b21fa1911210766ae5e187a4bedd77a1dc88f4df9c17ccc7eb" Apr 23 17:04:57.999462 ip-10-0-128-198 kubenswrapper[2580]: E0423 17:04:57.999441 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dc8a3134c6b79b21fa1911210766ae5e187a4bedd77a1dc88f4df9c17ccc7eb\": container with ID starting with 9dc8a3134c6b79b21fa1911210766ae5e187a4bedd77a1dc88f4df9c17ccc7eb not found: ID does not exist" containerID="9dc8a3134c6b79b21fa1911210766ae5e187a4bedd77a1dc88f4df9c17ccc7eb" Apr 23 17:04:57.999532 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.999475 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dc8a3134c6b79b21fa1911210766ae5e187a4bedd77a1dc88f4df9c17ccc7eb"} err="failed to get container status \"9dc8a3134c6b79b21fa1911210766ae5e187a4bedd77a1dc88f4df9c17ccc7eb\": rpc error: code = NotFound desc = could not find container \"9dc8a3134c6b79b21fa1911210766ae5e187a4bedd77a1dc88f4df9c17ccc7eb\": container with ID starting with 9dc8a3134c6b79b21fa1911210766ae5e187a4bedd77a1dc88f4df9c17ccc7eb not found: ID does not exist" Apr 23 17:04:57.999532 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.999496 2580 scope.go:117] "RemoveContainer" containerID="23a744ec405d741b058573fb34d51878aa8f5cb9ab5800ac8344210480092920" Apr 23 17:04:57.999710 ip-10-0-128-198 kubenswrapper[2580]: E0423 17:04:57.999696 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23a744ec405d741b058573fb34d51878aa8f5cb9ab5800ac8344210480092920\": container with ID starting with 23a744ec405d741b058573fb34d51878aa8f5cb9ab5800ac8344210480092920 not found: ID does not exist" containerID="23a744ec405d741b058573fb34d51878aa8f5cb9ab5800ac8344210480092920" Apr 23 17:04:57.999752 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.999715 2580 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"23a744ec405d741b058573fb34d51878aa8f5cb9ab5800ac8344210480092920"} err="failed to get container status \"23a744ec405d741b058573fb34d51878aa8f5cb9ab5800ac8344210480092920\": rpc error: code = NotFound desc = could not find container \"23a744ec405d741b058573fb34d51878aa8f5cb9ab5800ac8344210480092920\": container with ID starting with 23a744ec405d741b058573fb34d51878aa8f5cb9ab5800ac8344210480092920 not found: ID does not exist" Apr 23 17:04:57.999752 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.999729 2580 scope.go:117] "RemoveContainer" containerID="5c689740e4333d0e190cdb06db4368180832524e4ebc9f8ad02285e122ff92ab" Apr 23 17:04:57.999910 ip-10-0-128-198 kubenswrapper[2580]: E0423 17:04:57.999895 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c689740e4333d0e190cdb06db4368180832524e4ebc9f8ad02285e122ff92ab\": container with ID starting with 5c689740e4333d0e190cdb06db4368180832524e4ebc9f8ad02285e122ff92ab not found: ID does not exist" containerID="5c689740e4333d0e190cdb06db4368180832524e4ebc9f8ad02285e122ff92ab" Apr 23 17:04:57.999952 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:57.999916 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c689740e4333d0e190cdb06db4368180832524e4ebc9f8ad02285e122ff92ab"} err="failed to get container status \"5c689740e4333d0e190cdb06db4368180832524e4ebc9f8ad02285e122ff92ab\": rpc error: code = NotFound desc = could not find container \"5c689740e4333d0e190cdb06db4368180832524e4ebc9f8ad02285e122ff92ab\": container with ID starting with 5c689740e4333d0e190cdb06db4368180832524e4ebc9f8ad02285e122ff92ab not found: ID does not exist" Apr 23 17:04:58.538021 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:04:58.537985 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3197270b-bdb5-4935-a528-aa7103cfa4ad" 
path="/var/lib/kubelet/pods/3197270b-bdb5-4935-a528-aa7103cfa4ad/volumes" Apr 23 17:05:11.275120 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:11.275086 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-s5qjx_b1752178-f74d-452b-ad33-b8e28f685826/istio-proxy/0.log" Apr 23 17:05:11.283078 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:11.283054 2580 ???:1] "http: TLS handshake error from 10.0.136.27:46422: EOF" Apr 23 17:05:11.319768 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:11.319731 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/main/0.log" Apr 23 17:05:11.330033 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:11.330011 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/llm-d-routing-sidecar/0.log" Apr 23 17:05:11.342972 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:11.342948 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/storage-initializer/0.log" Apr 23 17:05:12.337564 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:12.337527 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-s5qjx_b1752178-f74d-452b-ad33-b8e28f685826/istio-proxy/0.log" Apr 23 17:05:12.360557 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:12.360531 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/main/0.log" Apr 23 17:05:12.367943 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:12.367915 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/llm-d-routing-sidecar/0.log" Apr 23 17:05:12.390504 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:12.390485 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/storage-initializer/0.log" Apr 23 17:05:13.463643 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:13.463615 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-s5qjx_b1752178-f74d-452b-ad33-b8e28f685826/istio-proxy/0.log" Apr 23 17:05:13.484971 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:13.484931 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/main/0.log" Apr 23 17:05:13.491720 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:13.491682 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/llm-d-routing-sidecar/0.log" Apr 23 17:05:13.502944 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:13.502922 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/storage-initializer/0.log" Apr 23 17:05:14.461444 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:14.461360 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-s5qjx_b1752178-f74d-452b-ad33-b8e28f685826/istio-proxy/0.log" Apr 23 17:05:14.483399 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:14.483376 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/main/0.log" Apr 23 17:05:14.491653 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:14.491631 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/llm-d-routing-sidecar/0.log" Apr 23 17:05:14.505082 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:14.505059 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/storage-initializer/0.log" Apr 23 17:05:14.645797 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:14.645769 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfkqz_5949893b-cd3d-46d5-b194-4ef1ad542b81/ovn-acl-logging/0.log" Apr 23 17:05:14.649595 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:14.649575 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfkqz_5949893b-cd3d-46d5-b194-4ef1ad542b81/ovn-acl-logging/0.log" Apr 23 17:05:15.514525 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:15.514495 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-s5qjx_b1752178-f74d-452b-ad33-b8e28f685826/istio-proxy/0.log" Apr 23 17:05:15.536433 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:15.536404 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/main/0.log" Apr 23 17:05:15.543166 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:15.543145 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/llm-d-routing-sidecar/0.log" Apr 23 17:05:15.554467 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:15.554443 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/storage-initializer/0.log" Apr 23 17:05:16.505015 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:16.504985 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-s5qjx_b1752178-f74d-452b-ad33-b8e28f685826/istio-proxy/0.log" Apr 23 17:05:16.526830 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:16.526802 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/main/0.log" Apr 23 17:05:16.533155 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:16.533138 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/llm-d-routing-sidecar/0.log" Apr 23 17:05:16.542825 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:16.542804 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/storage-initializer/0.log" Apr 23 17:05:17.498601 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:17.498552 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-s5qjx_b1752178-f74d-452b-ad33-b8e28f685826/istio-proxy/0.log" Apr 23 17:05:17.524596 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:17.524564 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/main/0.log"
Apr 23 17:05:17.531536 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:17.531512 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/llm-d-routing-sidecar/0.log"
Apr 23 17:05:17.542269 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:17.542247 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/storage-initializer/0.log"
Apr 23 17:05:18.521484 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:18.521376 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-s5qjx_b1752178-f74d-452b-ad33-b8e28f685826/istio-proxy/0.log"
Apr 23 17:05:18.545166 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:18.545143 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/main/0.log"
Apr 23 17:05:18.551258 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:18.551236 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/llm-d-routing-sidecar/0.log"
Apr 23 17:05:18.562034 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:18.562016 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/storage-initializer/0.log"
Apr 23 17:05:19.577397 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:19.577367 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-s5qjx_b1752178-f74d-452b-ad33-b8e28f685826/istio-proxy/0.log"
Apr 23 17:05:19.600384 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:19.600356 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/main/0.log"
Apr 23 17:05:19.608902 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:19.608878 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/llm-d-routing-sidecar/0.log"
Apr 23 17:05:19.619475 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:19.619452 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/storage-initializer/0.log"
Apr 23 17:05:20.619886 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:20.619858 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-s5qjx_b1752178-f74d-452b-ad33-b8e28f685826/istio-proxy/0.log"
Apr 23 17:05:20.653919 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:20.653864 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/main/0.log"
Apr 23 17:05:20.661074 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:20.661051 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/llm-d-routing-sidecar/0.log"
Apr 23 17:05:20.672377 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:20.672352 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/storage-initializer/0.log"
Apr 23 17:05:21.626788 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:21.626758 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-s5qjx_b1752178-f74d-452b-ad33-b8e28f685826/istio-proxy/0.log"
Apr 23 17:05:21.652277 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:21.652248 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/main/0.log"
Apr 23 17:05:21.659225 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:21.659193 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/llm-d-routing-sidecar/0.log"
Apr 23 17:05:21.669717 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:21.669693 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/storage-initializer/0.log"
Apr 23 17:05:22.693893 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:22.693862 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-s5qjx_b1752178-f74d-452b-ad33-b8e28f685826/istio-proxy/0.log"
Apr 23 17:05:22.715493 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:22.715471 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/main/0.log"
Apr 23 17:05:22.728260 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:22.728234 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/llm-d-routing-sidecar/0.log"
Apr 23 17:05:22.736379 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:22.736360 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/storage-initializer/0.log"
Apr 23 17:05:23.672512 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:23.672480 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-s5qjx_b1752178-f74d-452b-ad33-b8e28f685826/istio-proxy/0.log"
Apr 23 17:05:23.694366 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:23.694339 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/main/0.log"
Apr 23 17:05:23.701039 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:23.701011 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/llm-d-routing-sidecar/0.log"
Apr 23 17:05:23.712021 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:23.711995 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/storage-initializer/0.log"
Apr 23 17:05:24.684718 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:24.684689 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-s5qjx_b1752178-f74d-452b-ad33-b8e28f685826/istio-proxy/0.log"
Apr 23 17:05:24.707800 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:24.707770 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/main/0.log"
Apr 23 17:05:24.714623 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:24.714594 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/llm-d-routing-sidecar/0.log"
Apr 23 17:05:24.726541 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:24.726519 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/storage-initializer/0.log"
Apr 23 17:05:26.179482 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:26.179438 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv" podUID="b4c156d5-a210-42eb-93ed-57a1a1998a2b" containerName="llm-d-routing-sidecar" containerID="cri-o://14e0c7d6dd852966c5eece597e90a5269eaccb1bef0d8d8627a2af825d5b7cf9" gracePeriod=2
Apr 23 17:05:26.445179 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:26.445156 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/main/0.log"
Apr 23 17:05:26.445795 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:26.445779 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv"
Apr 23 17:05:26.467317 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:26.467257 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b4c156d5-a210-42eb-93ed-57a1a1998a2b-model-cache\") pod \"b4c156d5-a210-42eb-93ed-57a1a1998a2b\" (UID: \"b4c156d5-a210-42eb-93ed-57a1a1998a2b\") "
Apr 23 17:05:26.467451 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:26.467332 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4c156d5-a210-42eb-93ed-57a1a1998a2b-kserve-provision-location\") pod \"b4c156d5-a210-42eb-93ed-57a1a1998a2b\" (UID: \"b4c156d5-a210-42eb-93ed-57a1a1998a2b\") "
Apr 23 17:05:26.467451 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:26.467385 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b4c156d5-a210-42eb-93ed-57a1a1998a2b-home\") pod \"b4c156d5-a210-42eb-93ed-57a1a1998a2b\" (UID: \"b4c156d5-a210-42eb-93ed-57a1a1998a2b\") "
Apr 23 17:05:26.467451 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:26.467437 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b4c156d5-a210-42eb-93ed-57a1a1998a2b-tls-certs\") pod \"b4c156d5-a210-42eb-93ed-57a1a1998a2b\" (UID: \"b4c156d5-a210-42eb-93ed-57a1a1998a2b\") "
Apr 23 17:05:26.467634 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:26.467512 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrkqx\" (UniqueName: \"kubernetes.io/projected/b4c156d5-a210-42eb-93ed-57a1a1998a2b-kube-api-access-rrkqx\") pod \"b4c156d5-a210-42eb-93ed-57a1a1998a2b\" (UID: \"b4c156d5-a210-42eb-93ed-57a1a1998a2b\") "
Apr 23 17:05:26.467634 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:26.467539 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b4c156d5-a210-42eb-93ed-57a1a1998a2b-dshm\") pod \"b4c156d5-a210-42eb-93ed-57a1a1998a2b\" (UID: \"b4c156d5-a210-42eb-93ed-57a1a1998a2b\") "
Apr 23 17:05:26.467634 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:26.467561 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4c156d5-a210-42eb-93ed-57a1a1998a2b-model-cache" (OuterVolumeSpecName: "model-cache") pod "b4c156d5-a210-42eb-93ed-57a1a1998a2b" (UID: "b4c156d5-a210-42eb-93ed-57a1a1998a2b"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 17:05:26.467808 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:26.467782 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b4c156d5-a210-42eb-93ed-57a1a1998a2b-model-cache\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 17:05:26.468056 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:26.468009 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4c156d5-a210-42eb-93ed-57a1a1998a2b-home" (OuterVolumeSpecName: "home") pod "b4c156d5-a210-42eb-93ed-57a1a1998a2b" (UID: "b4c156d5-a210-42eb-93ed-57a1a1998a2b"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 17:05:26.470482 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:26.470261 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4c156d5-a210-42eb-93ed-57a1a1998a2b-kube-api-access-rrkqx" (OuterVolumeSpecName: "kube-api-access-rrkqx") pod "b4c156d5-a210-42eb-93ed-57a1a1998a2b" (UID: "b4c156d5-a210-42eb-93ed-57a1a1998a2b"). InnerVolumeSpecName "kube-api-access-rrkqx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 17:05:26.471817 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:26.471777 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4c156d5-a210-42eb-93ed-57a1a1998a2b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b4c156d5-a210-42eb-93ed-57a1a1998a2b" (UID: "b4c156d5-a210-42eb-93ed-57a1a1998a2b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 17:05:26.471817 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:26.471811 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4c156d5-a210-42eb-93ed-57a1a1998a2b-dshm" (OuterVolumeSpecName: "dshm") pod "b4c156d5-a210-42eb-93ed-57a1a1998a2b" (UID: "b4c156d5-a210-42eb-93ed-57a1a1998a2b"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 17:05:26.538230 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:26.538195 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4c156d5-a210-42eb-93ed-57a1a1998a2b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b4c156d5-a210-42eb-93ed-57a1a1998a2b" (UID: "b4c156d5-a210-42eb-93ed-57a1a1998a2b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 17:05:26.568878 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:26.568836 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rrkqx\" (UniqueName: \"kubernetes.io/projected/b4c156d5-a210-42eb-93ed-57a1a1998a2b-kube-api-access-rrkqx\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 17:05:26.568878 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:26.568873 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b4c156d5-a210-42eb-93ed-57a1a1998a2b-dshm\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 17:05:26.568878 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:26.568883 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4c156d5-a210-42eb-93ed-57a1a1998a2b-kserve-provision-location\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 17:05:26.569109 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:26.568893 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b4c156d5-a210-42eb-93ed-57a1a1998a2b-home\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 17:05:26.569109 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:26.568903 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b4c156d5-a210-42eb-93ed-57a1a1998a2b-tls-certs\") on node \"ip-10-0-128-198.ec2.internal\" DevicePath \"\""
Apr 23 17:05:27.085967 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:27.085933 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d668c8c46-wlvlv_b4c156d5-a210-42eb-93ed-57a1a1998a2b/main/0.log"
Apr 23 17:05:27.086543 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:27.086517 2580 generic.go:358] "Generic (PLEG): container finished" podID="b4c156d5-a210-42eb-93ed-57a1a1998a2b" containerID="595a636bd45d9554c6a15e46ef4d3c1be70153df03c6e38a196092a534f0c426" exitCode=137
Apr 23 17:05:27.086543 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:27.086541 2580 generic.go:358] "Generic (PLEG): container finished" podID="b4c156d5-a210-42eb-93ed-57a1a1998a2b" containerID="14e0c7d6dd852966c5eece597e90a5269eaccb1bef0d8d8627a2af825d5b7cf9" exitCode=0
Apr 23 17:05:27.086657 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:27.086576 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv" event={"ID":"b4c156d5-a210-42eb-93ed-57a1a1998a2b","Type":"ContainerDied","Data":"595a636bd45d9554c6a15e46ef4d3c1be70153df03c6e38a196092a534f0c426"}
Apr 23 17:05:27.086657 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:27.086598 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv"
Apr 23 17:05:27.086657 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:27.086617 2580 scope.go:117] "RemoveContainer" containerID="595a636bd45d9554c6a15e46ef4d3c1be70153df03c6e38a196092a534f0c426"
Apr 23 17:05:27.086782 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:27.086607 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv" event={"ID":"b4c156d5-a210-42eb-93ed-57a1a1998a2b","Type":"ContainerDied","Data":"14e0c7d6dd852966c5eece597e90a5269eaccb1bef0d8d8627a2af825d5b7cf9"}
Apr 23 17:05:27.086852 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:27.086771 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv" event={"ID":"b4c156d5-a210-42eb-93ed-57a1a1998a2b","Type":"ContainerDied","Data":"c151214cb5a000fcfad20c50c7fe733e7686c150a7120c4e7b3d16c5c02778e6"}
Apr 23 17:05:27.107765 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:27.107745 2580 scope.go:117] "RemoveContainer" containerID="0697f22c6a846042923d901728f26d8eac550c9b754f8f497a772eb70a9d4256"
Apr 23 17:05:27.113077 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:27.113053 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv"]
Apr 23 17:05:27.118183 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:27.118160 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d668c8c46-wlvlv"]
Apr 23 17:05:27.120397 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:27.120375 2580 scope.go:117] "RemoveContainer" containerID="14e0c7d6dd852966c5eece597e90a5269eaccb1bef0d8d8627a2af825d5b7cf9"
Apr 23 17:05:27.128021 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:27.127988 2580 scope.go:117] "RemoveContainer" containerID="595a636bd45d9554c6a15e46ef4d3c1be70153df03c6e38a196092a534f0c426"
Apr 23 17:05:27.128276 ip-10-0-128-198 kubenswrapper[2580]: E0423 17:05:27.128259 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"595a636bd45d9554c6a15e46ef4d3c1be70153df03c6e38a196092a534f0c426\": container with ID starting with 595a636bd45d9554c6a15e46ef4d3c1be70153df03c6e38a196092a534f0c426 not found: ID does not exist" containerID="595a636bd45d9554c6a15e46ef4d3c1be70153df03c6e38a196092a534f0c426"
Apr 23 17:05:27.128412 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:27.128284 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"595a636bd45d9554c6a15e46ef4d3c1be70153df03c6e38a196092a534f0c426"} err="failed to get container status \"595a636bd45d9554c6a15e46ef4d3c1be70153df03c6e38a196092a534f0c426\": rpc error: code = NotFound desc = could not find container \"595a636bd45d9554c6a15e46ef4d3c1be70153df03c6e38a196092a534f0c426\": container with ID starting with 595a636bd45d9554c6a15e46ef4d3c1be70153df03c6e38a196092a534f0c426 not found: ID does not exist"
Apr 23 17:05:27.128412 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:27.128351 2580 scope.go:117] "RemoveContainer" containerID="0697f22c6a846042923d901728f26d8eac550c9b754f8f497a772eb70a9d4256"
Apr 23 17:05:27.128607 ip-10-0-128-198 kubenswrapper[2580]: E0423 17:05:27.128584 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0697f22c6a846042923d901728f26d8eac550c9b754f8f497a772eb70a9d4256\": container with ID starting with 0697f22c6a846042923d901728f26d8eac550c9b754f8f497a772eb70a9d4256 not found: ID does not exist" containerID="0697f22c6a846042923d901728f26d8eac550c9b754f8f497a772eb70a9d4256"
Apr 23 17:05:27.128677 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:27.128616 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0697f22c6a846042923d901728f26d8eac550c9b754f8f497a772eb70a9d4256"} err="failed to get container status \"0697f22c6a846042923d901728f26d8eac550c9b754f8f497a772eb70a9d4256\": rpc error: code = NotFound desc = could not find container \"0697f22c6a846042923d901728f26d8eac550c9b754f8f497a772eb70a9d4256\": container with ID starting with 0697f22c6a846042923d901728f26d8eac550c9b754f8f497a772eb70a9d4256 not found: ID does not exist"
Apr 23 17:05:27.128677 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:27.128641 2580 scope.go:117] "RemoveContainer" containerID="14e0c7d6dd852966c5eece597e90a5269eaccb1bef0d8d8627a2af825d5b7cf9"
Apr 23 17:05:27.128874 ip-10-0-128-198 kubenswrapper[2580]: E0423 17:05:27.128856 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14e0c7d6dd852966c5eece597e90a5269eaccb1bef0d8d8627a2af825d5b7cf9\": container with ID starting with 14e0c7d6dd852966c5eece597e90a5269eaccb1bef0d8d8627a2af825d5b7cf9 not found: ID does not exist" containerID="14e0c7d6dd852966c5eece597e90a5269eaccb1bef0d8d8627a2af825d5b7cf9"
Apr 23 17:05:27.128925 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:27.128881 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14e0c7d6dd852966c5eece597e90a5269eaccb1bef0d8d8627a2af825d5b7cf9"} err="failed to get container status \"14e0c7d6dd852966c5eece597e90a5269eaccb1bef0d8d8627a2af825d5b7cf9\": rpc error: code = NotFound desc = could not find container \"14e0c7d6dd852966c5eece597e90a5269eaccb1bef0d8d8627a2af825d5b7cf9\": container with ID starting with 14e0c7d6dd852966c5eece597e90a5269eaccb1bef0d8d8627a2af825d5b7cf9 not found: ID does not exist"
Apr 23 17:05:27.128925 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:27.128896 2580 scope.go:117] "RemoveContainer" containerID="595a636bd45d9554c6a15e46ef4d3c1be70153df03c6e38a196092a534f0c426"
Apr 23 17:05:27.129143 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:27.129124 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"595a636bd45d9554c6a15e46ef4d3c1be70153df03c6e38a196092a534f0c426"} err="failed to get container status \"595a636bd45d9554c6a15e46ef4d3c1be70153df03c6e38a196092a534f0c426\": rpc error: code = NotFound desc = could not find container \"595a636bd45d9554c6a15e46ef4d3c1be70153df03c6e38a196092a534f0c426\": container with ID starting with 595a636bd45d9554c6a15e46ef4d3c1be70153df03c6e38a196092a534f0c426 not found: ID does not exist"
Apr 23 17:05:27.129199 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:27.129143 2580 scope.go:117] "RemoveContainer" containerID="0697f22c6a846042923d901728f26d8eac550c9b754f8f497a772eb70a9d4256"
Apr 23 17:05:27.129389 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:27.129365 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0697f22c6a846042923d901728f26d8eac550c9b754f8f497a772eb70a9d4256"} err="failed to get container status \"0697f22c6a846042923d901728f26d8eac550c9b754f8f497a772eb70a9d4256\": rpc error: code = NotFound desc = could not find container \"0697f22c6a846042923d901728f26d8eac550c9b754f8f497a772eb70a9d4256\": container with ID starting with 0697f22c6a846042923d901728f26d8eac550c9b754f8f497a772eb70a9d4256 not found: ID does not exist"
Apr 23 17:05:27.129450 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:27.129390 2580 scope.go:117] "RemoveContainer" containerID="14e0c7d6dd852966c5eece597e90a5269eaccb1bef0d8d8627a2af825d5b7cf9"
Apr 23 17:05:27.129636 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:27.129620 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14e0c7d6dd852966c5eece597e90a5269eaccb1bef0d8d8627a2af825d5b7cf9"} err="failed to get container status \"14e0c7d6dd852966c5eece597e90a5269eaccb1bef0d8d8627a2af825d5b7cf9\": rpc error: code = NotFound desc = could not find container \"14e0c7d6dd852966c5eece597e90a5269eaccb1bef0d8d8627a2af825d5b7cf9\": container with ID starting with 14e0c7d6dd852966c5eece597e90a5269eaccb1bef0d8d8627a2af825d5b7cf9 not found: ID does not exist"
Apr 23 17:05:27.461812 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:27.461735 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-5xwxm_9d6557a0-dda5-43f0-ad59-78c18ad4d290/manager/0.log"
Apr 23 17:05:27.524265 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:27.524234 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-mqgtx_6fc9d8b5-302d-46a4-acf1-8e2b2d75a8b2/manager/0.log"
Apr 23 17:05:27.534368 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:27.534342 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-9jtks_b1a52c1c-80a1-4ed7-9fb4-6529bd1c776a/limitador/0.log"
Apr 23 17:05:28.537268 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:28.537234 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4c156d5-a210-42eb-93ed-57a1a1998a2b" path="/var/lib/kubelet/pods/b4c156d5-a210-42eb-93ed-57a1a1998a2b/volumes"
Apr 23 17:05:29.853877 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:29.853843 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2fptc/must-gather-7g78r"]
Apr 23 17:05:29.854247 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:29.854219 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="06e48821-aabf-4e37-9252-48411a197de1" containerName="main"
Apr 23 17:05:29.854247 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:29.854229 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="06e48821-aabf-4e37-9252-48411a197de1" containerName="main"
Apr 23 17:05:29.854247 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:29.854243 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4c156d5-a210-42eb-93ed-57a1a1998a2b" containerName="storage-initializer"
Apr 23 17:05:29.854247 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:29.854248 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4c156d5-a210-42eb-93ed-57a1a1998a2b" containerName="storage-initializer"
Apr 23 17:05:29.854398 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:29.854257 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4c156d5-a210-42eb-93ed-57a1a1998a2b" containerName="llm-d-routing-sidecar"
Apr 23 17:05:29.854398 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:29.854263 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4c156d5-a210-42eb-93ed-57a1a1998a2b" containerName="llm-d-routing-sidecar"
Apr 23 17:05:29.854398 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:29.854275 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4c156d5-a210-42eb-93ed-57a1a1998a2b" containerName="main"
Apr 23 17:05:29.854398 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:29.854280 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4c156d5-a210-42eb-93ed-57a1a1998a2b" containerName="main"
Apr 23 17:05:29.854398 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:29.854305 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="06e48821-aabf-4e37-9252-48411a197de1" containerName="storage-initializer"
Apr 23 17:05:29.854398 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:29.854311 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="06e48821-aabf-4e37-9252-48411a197de1" containerName="storage-initializer"
Apr 23 17:05:29.854398 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:29.854321 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3197270b-bdb5-4935-a528-aa7103cfa4ad" containerName="main"
Apr 23 17:05:29.854398 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:29.854326 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3197270b-bdb5-4935-a528-aa7103cfa4ad" containerName="main"
Apr 23 17:05:29.854398 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:29.854332 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3197270b-bdb5-4935-a528-aa7103cfa4ad" containerName="tokenizer"
Apr 23 17:05:29.854398 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:29.854336 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3197270b-bdb5-4935-a528-aa7103cfa4ad" containerName="tokenizer"
Apr 23 17:05:29.854398 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:29.854344 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3197270b-bdb5-4935-a528-aa7103cfa4ad" containerName="storage-initializer"
Apr 23 17:05:29.854398 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:29.854349 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3197270b-bdb5-4935-a528-aa7103cfa4ad" containerName="storage-initializer"
Apr 23 17:05:29.854747 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:29.854407 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="06e48821-aabf-4e37-9252-48411a197de1" containerName="main"
Apr 23 17:05:29.854747 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:29.854415 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="b4c156d5-a210-42eb-93ed-57a1a1998a2b" containerName="main"
Apr 23 17:05:29.854747 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:29.854422 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="3197270b-bdb5-4935-a528-aa7103cfa4ad" containerName="tokenizer"
Apr 23 17:05:29.854747 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:29.854430 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="3197270b-bdb5-4935-a528-aa7103cfa4ad" containerName="main"
Apr 23 17:05:29.854747 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:29.854439 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="b4c156d5-a210-42eb-93ed-57a1a1998a2b" containerName="llm-d-routing-sidecar"
Apr 23 17:05:29.859701 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:29.859682 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2fptc/must-gather-7g78r"
Apr 23 17:05:29.862644 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:29.862618 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-2fptc\"/\"default-dockercfg-wnz7f\""
Apr 23 17:05:29.862772 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:29.862667 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2fptc\"/\"openshift-service-ca.crt\""
Apr 23 17:05:29.862772 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:29.862682 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2fptc\"/\"kube-root-ca.crt\""
Apr 23 17:05:29.864741 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:29.864712 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2fptc/must-gather-7g78r"]
Apr 23 17:05:29.897646 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:29.897610 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0f924587-678d-4214-b612-5bb8beb226d8-must-gather-output\") pod \"must-gather-7g78r\" (UID: \"0f924587-678d-4214-b612-5bb8beb226d8\") " pod="openshift-must-gather-2fptc/must-gather-7g78r"
Apr 23 17:05:29.897810 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:29.897654 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc47f\" (UniqueName: \"kubernetes.io/projected/0f924587-678d-4214-b612-5bb8beb226d8-kube-api-access-zc47f\") pod \"must-gather-7g78r\" (UID: \"0f924587-678d-4214-b612-5bb8beb226d8\") " pod="openshift-must-gather-2fptc/must-gather-7g78r"
Apr 23 17:05:29.998839 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:29.998804 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0f924587-678d-4214-b612-5bb8beb226d8-must-gather-output\") pod \"must-gather-7g78r\" (UID: \"0f924587-678d-4214-b612-5bb8beb226d8\") " pod="openshift-must-gather-2fptc/must-gather-7g78r"
Apr 23 17:05:29.999011 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:29.998848 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zc47f\" (UniqueName: \"kubernetes.io/projected/0f924587-678d-4214-b612-5bb8beb226d8-kube-api-access-zc47f\") pod \"must-gather-7g78r\" (UID: \"0f924587-678d-4214-b612-5bb8beb226d8\") " pod="openshift-must-gather-2fptc/must-gather-7g78r"
Apr 23 17:05:29.999151 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:29.999131 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0f924587-678d-4214-b612-5bb8beb226d8-must-gather-output\") pod \"must-gather-7g78r\" (UID: \"0f924587-678d-4214-b612-5bb8beb226d8\") " pod="openshift-must-gather-2fptc/must-gather-7g78r"
Apr 23 17:05:30.007547 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:30.007525 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc47f\" (UniqueName: \"kubernetes.io/projected/0f924587-678d-4214-b612-5bb8beb226d8-kube-api-access-zc47f\") pod \"must-gather-7g78r\" (UID: \"0f924587-678d-4214-b612-5bb8beb226d8\") " pod="openshift-must-gather-2fptc/must-gather-7g78r"
Apr 23 17:05:30.169157 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:30.169064 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2fptc/must-gather-7g78r"
Apr 23 17:05:30.296439 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:30.296414 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2fptc/must-gather-7g78r"]
Apr 23 17:05:30.298326 ip-10-0-128-198 kubenswrapper[2580]: W0423 17:05:30.298265 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f924587_678d_4214_b612_5bb8beb226d8.slice/crio-00c9947eabd172b37bcc3dc7421bc7e3861a73ac33fbdea5caf4e59f22b4c8ae WatchSource:0}: Error finding container 00c9947eabd172b37bcc3dc7421bc7e3861a73ac33fbdea5caf4e59f22b4c8ae: Status 404 returned error can't find the container with id 00c9947eabd172b37bcc3dc7421bc7e3861a73ac33fbdea5caf4e59f22b4c8ae
Apr 23 17:05:31.103883 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:31.103835 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2fptc/must-gather-7g78r" event={"ID":"0f924587-678d-4214-b612-5bb8beb226d8","Type":"ContainerStarted","Data":"00c9947eabd172b37bcc3dc7421bc7e3861a73ac33fbdea5caf4e59f22b4c8ae"}
Apr 23 17:05:32.109565 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:32.109526 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2fptc/must-gather-7g78r" event={"ID":"0f924587-678d-4214-b612-5bb8beb226d8","Type":"ContainerStarted","Data":"54e0158d15d3e3cc469554f9ea7d306c046a995354da33594bbb4cb98959e480"}
Apr 23 17:05:32.109943 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:32.109567 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2fptc/must-gather-7g78r" event={"ID":"0f924587-678d-4214-b612-5bb8beb226d8","Type":"ContainerStarted","Data":"f75d9d54a8f07408c971fdc26461f08cf8d0269aac65eef61f01a348c00de75f"}
Apr 23 17:05:32.129002 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:32.128937 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2fptc/must-gather-7g78r" podStartSLOduration=2.259298806 podStartE2EDuration="3.12891658s" podCreationTimestamp="2026-04-23 17:05:29 +0000 UTC" firstStartedPulling="2026-04-23 17:05:30.300011206 +0000 UTC m=+1816.361085442" lastFinishedPulling="2026-04-23 17:05:31.169628981 +0000 UTC m=+1817.230703216" observedRunningTime="2026-04-23 17:05:32.12638931 +0000 UTC m=+1818.187463578" watchObservedRunningTime="2026-04-23 17:05:32.12891658 +0000 UTC m=+1818.189990835"
Apr 23 17:05:32.790461 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:32.790429 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-tr482_047666b9-5e8b-4117-8317-ca917bf89757/global-pull-secret-syncer/0.log"
Apr 23 17:05:32.894236 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:32.894198 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-q8rd7_268f3349-6678-43d5-8596-698c807f908a/konnectivity-agent/0.log"
Apr 23 17:05:32.910634 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:32.910598 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-198.ec2.internal_49434315036770a524de2f8664b84004/haproxy/0.log"
Apr 23 17:05:36.090889 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:36.090852 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-5xwxm_9d6557a0-dda5-43f0-ad59-78c18ad4d290/manager/0.log"
Apr 23 17:05:36.186069 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:36.186015 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-mqgtx_6fc9d8b5-302d-46a4-acf1-8e2b2d75a8b2/manager/0.log"
Apr 23 17:05:36.209611 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:36.209572 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-9jtks_b1a52c1c-80a1-4ed7-9fb4-6529bd1c776a/limitador/0.log"
Apr 23 17:05:37.987371 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:37.987340 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-58gvv_86ae9c55-d565-4339-b6c1-9cb99a952b70/monitoring-plugin/0.log"
Apr 23 17:05:38.082230 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:38.082192 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bx66h_5fc385bb-6168-4cf7-9f6f-3a7d17005621/node-exporter/0.log"
Apr 23 17:05:38.101798 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:38.101767 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bx66h_5fc385bb-6168-4cf7-9f6f-3a7d17005621/kube-rbac-proxy/0.log"
Apr 23 17:05:38.119211 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:38.119185 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bx66h_5fc385bb-6168-4cf7-9f6f-3a7d17005621/init-textfile/0.log"
Apr 23 17:05:38.204923 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:38.204895 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-lkkzb_61ecdfb7-6d32-488b-b3b1-99d8f7adbe0d/kube-rbac-proxy-main/0.log"
Apr 23 17:05:38.225664 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:38.225632 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-lkkzb_61ecdfb7-6d32-488b-b3b1-99d8f7adbe0d/kube-rbac-proxy-self/0.log"
Apr 23 17:05:38.248148 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:38.248051 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-lkkzb_61ecdfb7-6d32-488b-b3b1-99d8f7adbe0d/openshift-state-metrics/0.log"
Apr 23 17:05:38.453225 ip-10-0-128-198
kubenswrapper[2580]: I0423 17:05:38.453190 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-2svfk_5e762e20-55d7-4136-9407-bb91ee9cadca/prometheus-operator-admission-webhook/0.log" Apr 23 17:05:38.604305 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:38.604193 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-f7784b7cd-krvh2_3cbcad22-6cfd-4816-a73c-152549b91eeb/thanos-query/0.log" Apr 23 17:05:38.635597 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:38.635563 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-f7784b7cd-krvh2_3cbcad22-6cfd-4816-a73c-152549b91eeb/kube-rbac-proxy-web/0.log" Apr 23 17:05:38.662101 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:38.662065 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-f7784b7cd-krvh2_3cbcad22-6cfd-4816-a73c-152549b91eeb/kube-rbac-proxy/0.log" Apr 23 17:05:38.683910 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:38.683881 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-f7784b7cd-krvh2_3cbcad22-6cfd-4816-a73c-152549b91eeb/prom-label-proxy/0.log" Apr 23 17:05:38.713249 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:38.713213 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-f7784b7cd-krvh2_3cbcad22-6cfd-4816-a73c-152549b91eeb/kube-rbac-proxy-rules/0.log" Apr 23 17:05:38.737670 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:38.737605 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-f7784b7cd-krvh2_3cbcad22-6cfd-4816-a73c-152549b91eeb/kube-rbac-proxy-metrics/0.log" Apr 23 17:05:41.325645 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:41.325615 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-d4d9b9949-zcf8z_6e1a41ee-fc33-41dd-953f-d37e45dacbee/console/0.log" Apr 23 17:05:41.356715 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:41.356685 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-ph9nc_1148e1b2-bfa3-471a-876a-52bc12750931/download-server/0.log" Apr 23 17:05:41.727151 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:41.727104 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2fptc/perf-node-gather-daemonset-7wgq6"] Apr 23 17:05:41.734440 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:41.734412 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-7wgq6" Apr 23 17:05:41.741563 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:41.741534 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2fptc/perf-node-gather-daemonset-7wgq6"] Apr 23 17:05:41.814721 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:41.814664 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ac59a511-83c2-4933-8731-1e4fa8f770b8-sys\") pod \"perf-node-gather-daemonset-7wgq6\" (UID: \"ac59a511-83c2-4933-8731-1e4fa8f770b8\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-7wgq6" Apr 23 17:05:41.815041 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:41.815018 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ac59a511-83c2-4933-8731-1e4fa8f770b8-proc\") pod \"perf-node-gather-daemonset-7wgq6\" (UID: \"ac59a511-83c2-4933-8731-1e4fa8f770b8\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-7wgq6" Apr 23 17:05:41.815264 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:41.815245 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ac59a511-83c2-4933-8731-1e4fa8f770b8-lib-modules\") pod \"perf-node-gather-daemonset-7wgq6\" (UID: \"ac59a511-83c2-4933-8731-1e4fa8f770b8\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-7wgq6" Apr 23 17:05:41.816045 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:41.816022 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltn9r\" (UniqueName: \"kubernetes.io/projected/ac59a511-83c2-4933-8731-1e4fa8f770b8-kube-api-access-ltn9r\") pod \"perf-node-gather-daemonset-7wgq6\" (UID: \"ac59a511-83c2-4933-8731-1e4fa8f770b8\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-7wgq6" Apr 23 17:05:41.816278 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:41.816252 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ac59a511-83c2-4933-8731-1e4fa8f770b8-podres\") pod \"perf-node-gather-daemonset-7wgq6\" (UID: \"ac59a511-83c2-4933-8731-1e4fa8f770b8\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-7wgq6" Apr 23 17:05:41.917480 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:41.917435 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ltn9r\" (UniqueName: \"kubernetes.io/projected/ac59a511-83c2-4933-8731-1e4fa8f770b8-kube-api-access-ltn9r\") pod \"perf-node-gather-daemonset-7wgq6\" (UID: \"ac59a511-83c2-4933-8731-1e4fa8f770b8\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-7wgq6" Apr 23 17:05:41.917765 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:41.917743 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ac59a511-83c2-4933-8731-1e4fa8f770b8-podres\") pod \"perf-node-gather-daemonset-7wgq6\" (UID: 
\"ac59a511-83c2-4933-8731-1e4fa8f770b8\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-7wgq6" Apr 23 17:05:41.917968 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:41.917936 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ac59a511-83c2-4933-8731-1e4fa8f770b8-sys\") pod \"perf-node-gather-daemonset-7wgq6\" (UID: \"ac59a511-83c2-4933-8731-1e4fa8f770b8\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-7wgq6" Apr 23 17:05:41.918075 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:41.918052 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ac59a511-83c2-4933-8731-1e4fa8f770b8-sys\") pod \"perf-node-gather-daemonset-7wgq6\" (UID: \"ac59a511-83c2-4933-8731-1e4fa8f770b8\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-7wgq6" Apr 23 17:05:41.918207 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:41.918189 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ac59a511-83c2-4933-8731-1e4fa8f770b8-podres\") pod \"perf-node-gather-daemonset-7wgq6\" (UID: \"ac59a511-83c2-4933-8731-1e4fa8f770b8\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-7wgq6" Apr 23 17:05:41.918327 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:41.918307 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ac59a511-83c2-4933-8731-1e4fa8f770b8-proc\") pod \"perf-node-gather-daemonset-7wgq6\" (UID: \"ac59a511-83c2-4933-8731-1e4fa8f770b8\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-7wgq6" Apr 23 17:05:41.918473 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:41.918452 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/ac59a511-83c2-4933-8731-1e4fa8f770b8-lib-modules\") pod \"perf-node-gather-daemonset-7wgq6\" (UID: \"ac59a511-83c2-4933-8731-1e4fa8f770b8\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-7wgq6" Apr 23 17:05:41.918618 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:41.918589 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ac59a511-83c2-4933-8731-1e4fa8f770b8-lib-modules\") pod \"perf-node-gather-daemonset-7wgq6\" (UID: \"ac59a511-83c2-4933-8731-1e4fa8f770b8\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-7wgq6" Apr 23 17:05:41.918688 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:41.918349 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ac59a511-83c2-4933-8731-1e4fa8f770b8-proc\") pod \"perf-node-gather-daemonset-7wgq6\" (UID: \"ac59a511-83c2-4933-8731-1e4fa8f770b8\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-7wgq6" Apr 23 17:05:41.927645 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:41.927620 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltn9r\" (UniqueName: \"kubernetes.io/projected/ac59a511-83c2-4933-8731-1e4fa8f770b8-kube-api-access-ltn9r\") pod \"perf-node-gather-daemonset-7wgq6\" (UID: \"ac59a511-83c2-4933-8731-1e4fa8f770b8\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-7wgq6" Apr 23 17:05:42.048343 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:42.047853 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-7wgq6" Apr 23 17:05:42.201535 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:42.201503 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2fptc/perf-node-gather-daemonset-7wgq6"] Apr 23 17:05:42.205330 ip-10-0-128-198 kubenswrapper[2580]: W0423 17:05:42.203857 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podac59a511_83c2_4933_8731_1e4fa8f770b8.slice/crio-c8f5f89167e9408e88be74e5f4cf532bf66914e0d545b8fcf38135089793ad6d WatchSource:0}: Error finding container c8f5f89167e9408e88be74e5f4cf532bf66914e0d545b8fcf38135089793ad6d: Status 404 returned error can't find the container with id c8f5f89167e9408e88be74e5f4cf532bf66914e0d545b8fcf38135089793ad6d Apr 23 17:05:42.554520 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:42.554491 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4vzw2_0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4/dns/0.log" Apr 23 17:05:42.571695 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:42.571613 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4vzw2_0cdc9ee5-b02e-4e33-a558-3cb94cae3fe4/kube-rbac-proxy/0.log" Apr 23 17:05:42.703386 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:42.703354 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w5pdt_5da0035a-7e6e-4e50-9404-1dde996e4313/dns-node-resolver/0.log" Apr 23 17:05:43.168940 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:43.168903 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-7wgq6" event={"ID":"ac59a511-83c2-4933-8731-1e4fa8f770b8","Type":"ContainerStarted","Data":"c5e22b621e8580b802e26a21855fcf21e8b8f09f7ca6d3e9f1437ae5fb68f437"} Apr 23 17:05:43.168940 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:43.168942 2580 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-7wgq6" event={"ID":"ac59a511-83c2-4933-8731-1e4fa8f770b8","Type":"ContainerStarted","Data":"c8f5f89167e9408e88be74e5f4cf532bf66914e0d545b8fcf38135089793ad6d"} Apr 23 17:05:43.169180 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:43.169024 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-7wgq6" Apr 23 17:05:43.173703 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:43.173682 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-z94bz_8e6b0d15-ef44-4cfa-b68b-1d1110a5e3b4/node-ca/0.log" Apr 23 17:05:43.187553 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:43.187506 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-7wgq6" podStartSLOduration=2.18749311 podStartE2EDuration="2.18749311s" podCreationTimestamp="2026-04-23 17:05:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:05:43.184966668 +0000 UTC m=+1829.246040924" watchObservedRunningTime="2026-04-23 17:05:43.18749311 +0000 UTC m=+1829.248567427" Apr 23 17:05:44.510248 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:44.510215 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-zbx8c_e8a8577c-7241-452c-bed6-2d35076dce94/serve-healthcheck-canary/0.log" Apr 23 17:05:44.926173 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:44.926097 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-g4cmx_4f325d8f-ab8b-445f-8060-6728f5736741/kube-rbac-proxy/0.log" Apr 23 17:05:44.943972 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:44.943941 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-g4cmx_4f325d8f-ab8b-445f-8060-6728f5736741/exporter/0.log" Apr 23 17:05:44.961155 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:44.961130 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-g4cmx_4f325d8f-ab8b-445f-8060-6728f5736741/extractor/0.log" Apr 23 17:05:47.630902 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:47.630874 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-wblb7_5064328f-c6a1-4ada-86d4-02b842b09a19/openshift-lws-operator/0.log" Apr 23 17:05:48.111897 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:48.111868 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-5b898d7b9d-8z97z_b576b0bb-3d65-4229-8396-0f196cdbd516/manager/0.log" Apr 23 17:05:48.207335 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:48.207282 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-xktt4_897c34ed-98e1-4d45-85a6-ca0ce1c2c2bf/server/0.log" Apr 23 17:05:48.415899 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:48.415825 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-5kjzr_bf739949-4ce7-4c9b-a0c6-5a25705080da/manager/0.log" Apr 23 17:05:48.461798 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:48.461767 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-rzp4s_fef1709f-12fd-47ac-a6f6-7bf5822b7678/s3-init/0.log" Apr 23 17:05:48.480129 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:48.480104 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-rkbz5_1d6b432c-ebcd-4f32-8197-5a410b0ef10c/seaweedfs/0.log" Apr 23 17:05:49.183989 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:49.183962 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-7wgq6" Apr 23 17:05:53.256067 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:53.256034 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-77tn6_046b94e7-f586-4a7f-b68c-ca66c74a0ab6/kube-storage-version-migrator-operator/1.log" Apr 23 17:05:53.258577 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:53.258549 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-77tn6_046b94e7-f586-4a7f-b68c-ca66c74a0ab6/kube-storage-version-migrator-operator/0.log" Apr 23 17:05:54.445049 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:54.445018 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wqfht_ee57872d-b83c-49ea-b226-5322cb6d1db3/kube-multus-additional-cni-plugins/0.log" Apr 23 17:05:54.465104 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:54.465082 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wqfht_ee57872d-b83c-49ea-b226-5322cb6d1db3/egress-router-binary-copy/0.log" Apr 23 17:05:54.481794 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:54.481771 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wqfht_ee57872d-b83c-49ea-b226-5322cb6d1db3/cni-plugins/0.log" Apr 23 17:05:54.498079 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:54.498051 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wqfht_ee57872d-b83c-49ea-b226-5322cb6d1db3/bond-cni-plugin/0.log" Apr 23 17:05:54.514888 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:54.514860 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wqfht_ee57872d-b83c-49ea-b226-5322cb6d1db3/routeoverride-cni/0.log" Apr 23 17:05:54.543904 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:54.543864 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wqfht_ee57872d-b83c-49ea-b226-5322cb6d1db3/whereabouts-cni-bincopy/0.log" Apr 23 17:05:54.559509 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:54.559476 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wqfht_ee57872d-b83c-49ea-b226-5322cb6d1db3/whereabouts-cni/0.log" Apr 23 17:05:54.627881 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:54.627839 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xs2mg_b70dcd33-861c-4a46-8752-421c750db1ff/kube-multus/0.log" Apr 23 17:05:54.648324 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:54.648287 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5ps7g_4033b659-eaae-4ad3-a8a3-523bdf5fcf89/network-metrics-daemon/0.log" Apr 23 17:05:54.662684 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:54.662659 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5ps7g_4033b659-eaae-4ad3-a8a3-523bdf5fcf89/kube-rbac-proxy/0.log" Apr 23 17:05:55.863985 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:55.863939 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfkqz_5949893b-cd3d-46d5-b194-4ef1ad542b81/ovn-controller/0.log" Apr 23 17:05:55.877168 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:55.877139 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfkqz_5949893b-cd3d-46d5-b194-4ef1ad542b81/ovn-acl-logging/0.log" Apr 23 17:05:55.886365 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:55.886343 2580 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfkqz_5949893b-cd3d-46d5-b194-4ef1ad542b81/ovn-acl-logging/1.log" Apr 23 17:05:55.907950 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:55.907924 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfkqz_5949893b-cd3d-46d5-b194-4ef1ad542b81/kube-rbac-proxy-node/0.log" Apr 23 17:05:55.928181 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:55.928158 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfkqz_5949893b-cd3d-46d5-b194-4ef1ad542b81/kube-rbac-proxy-ovn-metrics/0.log" Apr 23 17:05:55.962881 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:55.962859 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfkqz_5949893b-cd3d-46d5-b194-4ef1ad542b81/northd/0.log" Apr 23 17:05:55.982418 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:55.982392 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfkqz_5949893b-cd3d-46d5-b194-4ef1ad542b81/nbdb/0.log" Apr 23 17:05:56.001679 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:56.001653 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfkqz_5949893b-cd3d-46d5-b194-4ef1ad542b81/sbdb/0.log" Apr 23 17:05:56.240932 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:56.240900 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfkqz_5949893b-cd3d-46d5-b194-4ef1ad542b81/ovnkube-controller/0.log" Apr 23 17:05:57.538150 ip-10-0-128-198 kubenswrapper[2580]: I0423 17:05:57.538112 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-kh9hh_3c0ebdcb-a7e8-4e29-a486-52aed308cf33/network-check-target-container/0.log" Apr 23 17:05:58.564168 ip-10-0-128-198 kubenswrapper[2580]: I0423 
17:05:58.564138 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-rflcs_398390dc-662b-42ee-b57e-22175922f0ac/iptables-alerter/0.log"