Apr 16 16:48:02.658342 ip-10-0-137-126 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 16:48:03.299477 ip-10-0-137-126 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:48:03.299477 ip-10-0-137-126 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 16:48:03.299477 ip-10-0-137-126 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:48:03.299477 ip-10-0-137-126 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 16:48:03.299477 ip-10-0-137-126 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:48:03.302817 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.302729 2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 16:48:03.309728 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309713 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:48:03.309728 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309728 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:48:03.309793 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309732 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:48:03.309793 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309736 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:48:03.309793 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309739 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:48:03.309793 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309742 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:48:03.309793 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309747 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:48:03.309793 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309750 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:48:03.309793 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309753 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:48:03.309793 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309756 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:48:03.309793 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309759 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:48:03.309793 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309762 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:48:03.309793 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309764 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:48:03.309793 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309768 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:48:03.309793 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309771 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:48:03.309793 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309773 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:48:03.309793 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309776 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:48:03.309793 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309778 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:48:03.309793 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309781 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:48:03.309793 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309784 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:48:03.309793 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309787 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:48:03.309793 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309790 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:48:03.310292 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309792 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:48:03.310292 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309795 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:48:03.310292 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309798 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:48:03.310292 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309801 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:48:03.310292 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309804 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:48:03.310292 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309807 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:48:03.310292 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309810 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:48:03.310292 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309812 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:48:03.310292 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309815 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:48:03.310292 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309818 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:48:03.310292 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309820 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:48:03.310292 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309823 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:48:03.310292 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309826 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:48:03.310292 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309829 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:48:03.310292 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309832 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:48:03.310292 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309834 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:48:03.310292 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309837 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:48:03.310292 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309839 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:48:03.310292 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309842 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:48:03.310292 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309844 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:48:03.310822 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309847 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:48:03.310822 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309850 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:48:03.310822 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309852 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:48:03.310822 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309855 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:48:03.310822 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309857 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:48:03.310822 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309860 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:48:03.310822 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309862 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:48:03.310822 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309865 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:48:03.310822 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309868 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:48:03.310822 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309870 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:48:03.310822 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309873 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:48:03.310822 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309876 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:48:03.310822 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309879 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:48:03.310822 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309883 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:48:03.310822 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309885 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:48:03.310822 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309888 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:48:03.310822 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309891 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:48:03.310822 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309894 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:48:03.310822 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309897 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:48:03.310822 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309899 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:48:03.311323 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309902 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:48:03.311323 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309905 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:48:03.311323 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309908 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:48:03.311323 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309911 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:48:03.311323 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309916 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:48:03.311323 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309919 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:48:03.311323 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309922 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:48:03.311323 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309925 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:48:03.311323 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309928 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:48:03.311323 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309931 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:48:03.311323 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309934 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:48:03.311323 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309936 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:48:03.311323 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309939 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:48:03.311323 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309942 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:48:03.311323 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309945 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:48:03.311323 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309948 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:48:03.311323 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309951 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:48:03.311323 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309954 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:48:03.311323 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309956 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:48:03.311778 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309959 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:48:03.311778 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309962 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:48:03.311778 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309965 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:48:03.311778 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309967 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:48:03.311778 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.309971 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:48:03.311778 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310407 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:48:03.311778 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310416 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:48:03.311778 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310419 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:48:03.311778 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310422 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:48:03.311778 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310425 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:48:03.311778 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310429 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:48:03.311778 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310432 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:48:03.311778 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310434 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:48:03.311778 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310437 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:48:03.311778 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310440 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:48:03.311778 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310442 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:48:03.311778 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310445 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:48:03.311778 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310448 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:48:03.311778 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310450 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:48:03.312261 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310453 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:48:03.312261 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310456 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:48:03.312261 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310458 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:48:03.312261 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310461 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:48:03.312261 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310464 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:48:03.312261 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310466 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:48:03.312261 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310469 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:48:03.312261 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310472 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:48:03.312261 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310474 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:48:03.312261 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310477 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:48:03.312261 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310480 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:48:03.312261 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310483 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:48:03.312261 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310485 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:48:03.312261 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310488 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:48:03.312261 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310491 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:48:03.312261 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310493 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:48:03.312261 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310496 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:48:03.312261 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310499 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:48:03.312261 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310501 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:48:03.312261 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310504 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:48:03.312757 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310507 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:48:03.312757 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310510 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:48:03.312757 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310512 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:48:03.312757 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310515 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:48:03.312757 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310518 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:48:03.312757 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310521 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:48:03.312757 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310523 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:48:03.312757 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310526 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:48:03.312757 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310530 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:48:03.312757 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310534 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:48:03.312757 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310536 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:48:03.312757 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310541 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:48:03.312757 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310544 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:48:03.312757 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310547 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:48:03.312757 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310549 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:48:03.312757 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310552 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:48:03.312757 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310555 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:48:03.312757 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310558 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:48:03.312757 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310561 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:48:03.312757 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310564 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:48:03.313296 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310567 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:48:03.313296 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310570 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:48:03.313296 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310572 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:48:03.313296 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310575 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:48:03.313296 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310578 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:48:03.313296 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310580 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:48:03.313296 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310583 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:48:03.313296 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310586 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:48:03.313296 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310589 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:48:03.313296 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310592 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:48:03.313296 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310595 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:48:03.313296 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310598 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:48:03.313296 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310603 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:48:03.313296 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310607 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:48:03.313296 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310610 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:48:03.313296 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310613 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:48:03.313296 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310615 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:48:03.313296 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310618 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:48:03.313296 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310621 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:48:03.313759 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310624 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:48:03.313759 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310627 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:48:03.313759 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310629 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:48:03.313759 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310632 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:48:03.313759 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310635 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:48:03.313759 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310638 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:48:03.313759 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310641 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:48:03.313759 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310643 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:48:03.313759 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310645 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:48:03.313759 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310648 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:48:03.313759 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310651 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:48:03.313759 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310653 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:48:03.313759 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.310656 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:48:03.313759 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311690 2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 16:48:03.313759 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311700 2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 16:48:03.313759 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311707 2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 16:48:03.313759 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311711 2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 16:48:03.313759 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311716 2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 16:48:03.313759 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311719 2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 16:48:03.313759 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311723 2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 16:48:03.313759 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311728 2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 16:48:03.314293 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311731 2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 16:48:03.314293 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311735 2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 16:48:03.314293 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311738 2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 16:48:03.314293 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311742 2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 16:48:03.314293 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311745 2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 16:48:03.314293 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311748 2572 flags.go:64] FLAG: --cgroup-root=""
Apr 16 16:48:03.314293 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311751 2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 16:48:03.314293 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311754 2572 flags.go:64] FLAG: --client-ca-file=""
Apr 16 16:48:03.314293 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311757 2572 flags.go:64] FLAG: --cloud-config=""
Apr 16 16:48:03.314293 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311760 2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 16:48:03.314293 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311764 2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 16:48:03.314293 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311771 2572 flags.go:64] FLAG: --cluster-domain=""
Apr 16 16:48:03.314293 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311774 2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 16:48:03.314293 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311777 2572 flags.go:64] FLAG: --config-dir=""
Apr 16 16:48:03.314293 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311780 2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 16:48:03.314293 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311785 2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 16:48:03.314293 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311789 2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 16:48:03.314293 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311792 2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 16:48:03.314293 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311795 2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 16:48:03.314293 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311799 2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 16:48:03.314293 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311802 2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 16:48:03.314293 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311805 2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 16:48:03.314293 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311808 2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 16:48:03.314293 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311811 2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 16:48:03.314293 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311814 2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 16:48:03.314905 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311819 2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 16:48:03.314905 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311822 2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 16:48:03.314905 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311825 2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 16:48:03.314905 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311828 2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 16:48:03.314905 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311831 2572 flags.go:64] FLAG: --enable-server="true"
Apr 16 16:48:03.314905 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311834 2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 16:48:03.314905 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311839 2572 flags.go:64] FLAG: --event-burst="100"
Apr 16 16:48:03.314905 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311841 2572 flags.go:64] FLAG: --event-qps="50"
Apr 16 16:48:03.314905 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311845 2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 16:48:03.314905 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311848 2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 16:48:03.314905 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311851 2572 flags.go:64] FLAG: --eviction-hard=""
Apr 16 16:48:03.314905 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311855 2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 16:48:03.314905 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311858 2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 16:48:03.314905 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311861 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 16:48:03.314905 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311864 2572 flags.go:64] FLAG: --eviction-soft=""
Apr 16 16:48:03.314905 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311867 2572 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 16:48:03.314905 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311870 2572 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 16:48:03.314905 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311873 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 16:48:03.314905 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311877 2572 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 16:48:03.314905 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311880 2572 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 16:48:03.314905 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311883 2572 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 16:48:03.314905 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311886 2572 flags.go:64] FLAG: --feature-gates=""
Apr 16 16:48:03.314905 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311890 2572 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 16:48:03.314905 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311893 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 16:48:03.314905 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311896 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 16:48:03.315541 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311899 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 16:48:03.315541 ip-10-0-137-126 kubenswrapper[2572]: I0416
16:48:03.311902 2572 flags.go:64] FLAG: --healthz-port="10248" Apr 16 16:48:03.315541 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311906 2572 flags.go:64] FLAG: --help="false" Apr 16 16:48:03.315541 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311909 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-137-126.ec2.internal" Apr 16 16:48:03.315541 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311912 2572 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 16:48:03.315541 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311915 2572 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 16:48:03.315541 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311918 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 16:48:03.315541 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311921 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 16:48:03.315541 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311925 2572 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 16:48:03.315541 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311928 2572 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 16:48:03.315541 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311931 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 16:48:03.315541 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311934 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 16:48:03.315541 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311937 2572 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 16:48:03.315541 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311940 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 16:48:03.315541 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311944 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 
16:48:03.315541 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311946 2572 flags.go:64] FLAG: --kube-reserved="" Apr 16 16:48:03.315541 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311949 2572 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 16:48:03.315541 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311952 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 16:48:03.315541 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311956 2572 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 16:48:03.315541 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311958 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 16:48:03.315541 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311961 2572 flags.go:64] FLAG: --lock-file="" Apr 16 16:48:03.315541 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311964 2572 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 16:48:03.315541 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311967 2572 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 16:48:03.315541 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311970 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 16:48:03.316134 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311976 2572 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 16:48:03.316134 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311979 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 16:48:03.316134 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311982 2572 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 16:48:03.316134 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311985 2572 flags.go:64] FLAG: --logging-format="text" Apr 16 16:48:03.316134 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311989 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 16:48:03.316134 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311992 2572 
flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 16:48:03.316134 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311995 2572 flags.go:64] FLAG: --manifest-url="" Apr 16 16:48:03.316134 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.311998 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 16 16:48:03.316134 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312002 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 16:48:03.316134 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312005 2572 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 16:48:03.316134 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312010 2572 flags.go:64] FLAG: --max-pods="110" Apr 16 16:48:03.316134 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312013 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 16:48:03.316134 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312016 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 16:48:03.316134 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312019 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 16:48:03.316134 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312023 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 16:48:03.316134 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312026 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 16:48:03.316134 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312029 2572 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 16:48:03.316134 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312032 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 16:48:03.316134 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312039 2572 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 16:48:03.316134 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312042 2572 flags.go:64] FLAG: 
--node-status-update-frequency="10s" Apr 16 16:48:03.316134 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312045 2572 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 16:48:03.316134 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312049 2572 flags.go:64] FLAG: --pod-cidr="" Apr 16 16:48:03.316134 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312052 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 16:48:03.316683 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312057 2572 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 16:48:03.316683 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312071 2572 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 16:48:03.316683 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312074 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 16 16:48:03.316683 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312077 2572 flags.go:64] FLAG: --port="10250" Apr 16 16:48:03.316683 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312080 2572 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 16:48:03.316683 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312097 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-01c4720a4ce471efb" Apr 16 16:48:03.316683 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312102 2572 flags.go:64] FLAG: --qos-reserved="" Apr 16 16:48:03.316683 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312105 2572 flags.go:64] FLAG: --read-only-port="10255" Apr 16 16:48:03.316683 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312108 2572 flags.go:64] FLAG: --register-node="true" Apr 16 16:48:03.316683 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312112 2572 flags.go:64] FLAG: --register-schedulable="true" Apr 16 16:48:03.316683 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312115 2572 flags.go:64] FLAG: --register-with-taints="" Apr 
16 16:48:03.316683 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312119 2572 flags.go:64] FLAG: --registry-burst="10" Apr 16 16:48:03.316683 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312123 2572 flags.go:64] FLAG: --registry-qps="5" Apr 16 16:48:03.316683 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312126 2572 flags.go:64] FLAG: --reserved-cpus="" Apr 16 16:48:03.316683 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312129 2572 flags.go:64] FLAG: --reserved-memory="" Apr 16 16:48:03.316683 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312132 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 16:48:03.316683 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312135 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 16:48:03.316683 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312139 2572 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 16:48:03.316683 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312142 2572 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 16:48:03.316683 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312145 2572 flags.go:64] FLAG: --runonce="false" Apr 16 16:48:03.316683 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312147 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 16:48:03.316683 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312151 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 16:48:03.316683 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312154 2572 flags.go:64] FLAG: --seccomp-default="false" Apr 16 16:48:03.316683 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312157 2572 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 16:48:03.316683 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312160 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 16:48:03.316683 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312163 2572 flags.go:64] FLAG: 
--storage-driver-db="cadvisor" Apr 16 16:48:03.317329 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312166 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 16:48:03.317329 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312170 2572 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 16:48:03.317329 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312173 2572 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 16:48:03.317329 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312176 2572 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 16:48:03.317329 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312179 2572 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 16:48:03.317329 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312182 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 16:48:03.317329 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312185 2572 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 16:48:03.317329 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312189 2572 flags.go:64] FLAG: --system-cgroups="" Apr 16 16:48:03.317329 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312191 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 16:48:03.317329 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312197 2572 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 16:48:03.317329 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312200 2572 flags.go:64] FLAG: --tls-cert-file="" Apr 16 16:48:03.317329 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312203 2572 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 16:48:03.317329 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312207 2572 flags.go:64] FLAG: --tls-min-version="" Apr 16 16:48:03.317329 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312210 2572 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 16:48:03.317329 ip-10-0-137-126 
kubenswrapper[2572]: I0416 16:48:03.312213 2572 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 16:48:03.317329 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312216 2572 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 16:48:03.317329 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312219 2572 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 16:48:03.317329 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312222 2572 flags.go:64] FLAG: --v="2" Apr 16 16:48:03.317329 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312227 2572 flags.go:64] FLAG: --version="false" Apr 16 16:48:03.317329 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312231 2572 flags.go:64] FLAG: --vmodule="" Apr 16 16:48:03.317329 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312235 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 16:48:03.317329 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.312239 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 16:48:03.317329 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312329 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:48:03.317329 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312333 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:48:03.317903 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312336 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 16:48:03.317903 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312339 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:48:03.317903 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312342 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:48:03.317903 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312345 2572 feature_gate.go:328] unrecognized 
feature gate: NewOLM Apr 16 16:48:03.317903 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312348 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:48:03.317903 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312351 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:48:03.317903 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312354 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:48:03.317903 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312356 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:48:03.317903 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312359 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:48:03.317903 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312362 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:48:03.317903 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312365 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:48:03.317903 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312369 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:48:03.317903 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312372 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:48:03.317903 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312376 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 16:48:03.317903 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312380 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:48:03.317903 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312383 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:48:03.317903 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312385 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:48:03.317903 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312388 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:48:03.317903 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312391 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:48:03.317903 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312393 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:48:03.318463 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312396 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:48:03.318463 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312398 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:48:03.318463 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312401 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:48:03.318463 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312404 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:48:03.318463 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312408 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 16:48:03.318463 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312411 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:48:03.318463 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312414 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:48:03.318463 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312416 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:48:03.318463 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312420 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:48:03.318463 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312422 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:48:03.318463 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312425 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 16:48:03.318463 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312428 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:48:03.318463 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312430 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:48:03.318463 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312433 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:48:03.318463 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312435 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:48:03.318463 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312438 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:48:03.318463 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312440 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:48:03.318463 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312443 2572 
feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:48:03.318463 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312445 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:48:03.318941 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312448 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:48:03.318941 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312450 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:48:03.318941 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312453 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:48:03.318941 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312455 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:48:03.318941 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312460 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:48:03.318941 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312462 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:48:03.318941 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312465 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:48:03.318941 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312468 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:48:03.318941 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312470 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 16:48:03.318941 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312473 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:48:03.318941 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312475 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:48:03.318941 
ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312478 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:48:03.318941 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312481 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:48:03.318941 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312483 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:48:03.318941 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312486 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:48:03.318941 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312488 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 16:48:03.318941 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312491 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:48:03.318941 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312494 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:48:03.318941 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312496 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:48:03.318941 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312499 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:48:03.319455 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312501 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:48:03.319455 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312504 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:48:03.319455 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312507 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:48:03.319455 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312510 2572 feature_gate.go:328] 
unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:48:03.319455 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312512 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:48:03.319455 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312514 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:48:03.319455 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312517 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:48:03.319455 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312519 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:48:03.319455 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312522 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:48:03.319455 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312524 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:48:03.319455 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312527 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:48:03.319455 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312530 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:48:03.319455 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312532 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:48:03.319455 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312535 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:48:03.319455 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312537 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:48:03.319455 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312540 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:48:03.319455 ip-10-0-137-126 kubenswrapper[2572]: W0416 
16:48:03.312544 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:48:03.319455 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312547 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:48:03.319455 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312549 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:48:03.319455 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312552 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:48:03.319953 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312555 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:48:03.319953 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312557 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:48:03.319953 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312560 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:48:03.319953 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312562 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:48:03.319953 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.312565 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:48:03.319953 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.313668 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 16:48:03.321677 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.321659 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 16:48:03.321714 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.321678 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 16:48:03.321860 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.321841 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:48:03.321860 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.321857 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:48:03.321860 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.321862 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:48:03.321860 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.321868 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:48:03.323776 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.323756 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:48:03.323776 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.323774 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:48:03.323776 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.323783 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:48:03.323956 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.323789 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:48:03.323956 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.323793 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:48:03.323956 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.323799 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:48:03.323956 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.323804 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:48:03.323956 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.323808 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:48:03.323956 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.323813 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:48:03.323956 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.323817 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:48:03.323956 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.323822 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:48:03.323956 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.323827 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:48:03.323956 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.323836 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:48:03.323956 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.323840 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:48:03.323956 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.323845 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:48:03.323956 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.323850 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:48:03.323956 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.323854 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:48:03.323956 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.323859 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:48:03.323956 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.323863 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:48:03.323956 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.323868 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:48:03.323956 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.323872 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:48:03.323956 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.323876 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:48:03.323956 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.323883 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:48:03.324495 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.323888 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:48:03.324495 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.323892 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:48:03.324495 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.323897 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:48:03.324495 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.323901 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:48:03.324495 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.323903 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:48:03.324495 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.323906 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:48:03.324495 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.323927 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:48:03.324495 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324103 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:48:03.324495 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324114 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:48:03.324495 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324117 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:48:03.324495 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324121 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:48:03.324495 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324124 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:48:03.324495 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324127 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:48:03.324495 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324129 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:48:03.324495 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324133 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:48:03.324495 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324136 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:48:03.324495 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324142 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:48:03.324495 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324147 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:48:03.324495 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324150 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:48:03.324974 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324154 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:48:03.324974 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324157 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:48:03.324974 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324160 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:48:03.324974 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324163 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:48:03.324974 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324167 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:48:03.324974 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324170 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:48:03.324974 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324173 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:48:03.324974 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324176 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:48:03.324974 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324178 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:48:03.324974 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324182 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:48:03.324974 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324184 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:48:03.324974 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324187 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:48:03.324974 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324190 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:48:03.324974 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324193 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:48:03.324974 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324196 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:48:03.324974 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324198 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:48:03.324974 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324201 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:48:03.324974 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324203 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:48:03.324974 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324206 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:48:03.324974 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324209 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:48:03.325485 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324212 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:48:03.325485 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324215 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:48:03.325485 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324217 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:48:03.325485 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324220 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:48:03.325485 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324222 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:48:03.325485 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324225 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:48:03.325485 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324228 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:48:03.325485 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324231 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:48:03.325485 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324233 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:48:03.325485 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324236 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:48:03.325485 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324238 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:48:03.325485 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324241 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:48:03.325485 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324244 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:48:03.325485 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324246 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:48:03.325485 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324249 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:48:03.325485 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324252 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:48:03.325485 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324255 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:48:03.325485 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324258 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:48:03.325485 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324260 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:48:03.325485 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324263 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:48:03.326052 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.324269 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 16:48:03.326052 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324372 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:48:03.326052 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324376 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:48:03.326052 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324380 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:48:03.326052 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324383 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:48:03.326052 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324386 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:48:03.326052 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324388 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:48:03.326052 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324391 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:48:03.326052 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324394 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:48:03.326052 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324397 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:48:03.326052 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324400 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:48:03.326052 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324404 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:48:03.326052 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324408 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:48:03.326052 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324411 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:48:03.326052 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324414 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:48:03.326052 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324417 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:48:03.326648 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324420 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:48:03.326648 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324422 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:48:03.326648 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324425 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:48:03.326648 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324428 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:48:03.326648 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324431 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:48:03.326648 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324434 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:48:03.326648 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324436 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:48:03.326648 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324439 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:48:03.326648 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324441 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:48:03.326648 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324444 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:48:03.326648 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324446 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:48:03.326648 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324449 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:48:03.326648 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324451 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:48:03.326648 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324454 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:48:03.326648 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324456 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:48:03.326648 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324459 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:48:03.326648 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324461 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:48:03.326648 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324464 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:48:03.326648 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324467 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:48:03.327130 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324469 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:48:03.327130 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324472 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:48:03.327130 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324475 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:48:03.327130 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324477 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:48:03.327130 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324480 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:48:03.327130 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324482 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:48:03.327130 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324485 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:48:03.327130 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324487 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:48:03.327130 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324490 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:48:03.327130 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324492 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:48:03.327130 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324495 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:48:03.327130 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324498 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:48:03.327130 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324501 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:48:03.327130 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324505 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:48:03.327130 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324509 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:48:03.327130 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324512 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:48:03.327130 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324515 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:48:03.327130 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324518 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:48:03.327130 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324521 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:48:03.327593 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324523 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:48:03.327593 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324526 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:48:03.327593 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324529 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:48:03.327593 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324532 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:48:03.327593 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324535 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:48:03.327593 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324538 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:48:03.327593 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324541 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:48:03.327593 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324543 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:48:03.327593 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324546 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:48:03.327593 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324549 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:48:03.327593 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324552 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:48:03.327593 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324555 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:48:03.327593 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324557 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:48:03.327593 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324560 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:48:03.327593 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324562 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:48:03.327593 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324565 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:48:03.327593 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324567 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:48:03.327593 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324570 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:48:03.327593 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324573 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:48:03.327593 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324575 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:48:03.328095 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324578 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:48:03.328095 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324580 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:48:03.328095 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324583 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:48:03.328095 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324586 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:48:03.328095 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324588 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:48:03.328095 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324591 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:48:03.328095 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324593 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:48:03.328095 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324596 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:48:03.328095 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324598 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:48:03.328095 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324601 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:48:03.328095 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324604 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:48:03.328095 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324607 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:48:03.328095 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:03.324609 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:48:03.328095 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.324614 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 16:48:03.328095 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.325304 2572 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 16:48:03.328469 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.327407 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 16:48:03.328529 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.328517 2572 server.go:1019] "Starting client certificate rotation"
Apr 16 16:48:03.328630 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.328615 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 16:48:03.328663 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.328657 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 16:48:03.361894 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.361875 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 16:48:03.369770 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.369745 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 16:48:03.392780 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.392759 2572 log.go:25] "Validated CRI v1 runtime API"
Apr 16 16:48:03.395105 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.395089 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 16:48:03.400462 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.400448 2572 log.go:25] "Validated CRI v1 image API"
Apr 16 16:48:03.401919 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.401903 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 16:48:03.405721 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.405700 2572 fs.go:135] Filesystem UUIDs: map[158171ce-9320-4328-9480-0a62882e2606:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 cef0de08-16c3-4a92-9283-9ebc53d180bd:/dev/nvme0n1p3]
Apr 16 16:48:03.405784 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.405721 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 16:48:03.411659 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.411544 2572 manager.go:217] Machine: {Timestamp:2026-04-16 16:48:03.410005905 +0000 UTC m=+0.578717276 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3096484 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2c115d7841a363debb43e85d8ec578 SystemUUID:ec2c115d-7841-a363-debb-43e85d8ec578 BootID:4667e21b-9b99-4f98-8408-f426ceeda200 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:f3:36:fc:7f:59 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:f3:36:fc:7f:59 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:4a:cd:40:4e:5a:72 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 16:48:03.411659 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.411649 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 16:48:03.411796 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.411731 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 16:48:03.413511 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.413490 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 16:48:03.413654 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.413513 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-126.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinRec
laim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 16:48:03.414403 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.414392 2572 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 16:48:03.414442 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.414406 2572 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 16:48:03.414442 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.414420 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 16:48:03.415752 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.415742 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 16:48:03.416641 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.416627 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-2jxqt" Apr 16 16:48:03.417335 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.417324 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 16 16:48:03.417437 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.417428 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 16:48:03.421446 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.421436 2572 kubelet.go:491] "Attempting to sync node with API server" Apr 16 16:48:03.421484 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.421450 2572 
kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 16:48:03.421484 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.421462 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 16:48:03.421484 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.421471 2572 kubelet.go:397] "Adding apiserver pod source" Apr 16 16:48:03.421484 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.421481 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 16:48:03.422543 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.422533 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 16:48:03.422593 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.422551 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 16:48:03.425856 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.425836 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-2jxqt" Apr 16 16:48:03.425915 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.425894 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 16:48:03.428008 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.427995 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 16:48:03.430217 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.430203 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 16:48:03.430217 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.430220 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 16:48:03.430310 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.430226 2572 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 16:48:03.430310 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.430233 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 16:48:03.430310 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.430238 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 16:48:03.430310 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.430244 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 16:48:03.430310 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.430249 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 16:48:03.430310 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.430255 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 16:48:03.430310 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.430262 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 16:48:03.430310 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.430268 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 16:48:03.430310 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.430282 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 16:48:03.430310 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.430290 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 16:48:03.431816 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.431807 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 16:48:03.431816 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.431816 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 16:48:03.435629 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.435615 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 16:48:03.435709 
ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.435654 2572 server.go:1295] "Started kubelet" Apr 16 16:48:03.436555 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.436505 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 16:48:03.436649 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.436538 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 16:48:03.436649 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.436577 2572 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 16:48:03.436565 ip-10-0-137-126 systemd[1]: Started Kubernetes Kubelet. Apr 16 16:48:03.438160 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.438132 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 16:48:03.438950 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.438935 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:48:03.439009 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.438970 2572 server.go:317] "Adding debug handlers to kubelet server" Apr 16 16:48:03.442006 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.441990 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:48:03.443750 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.443728 2572 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-137-126.ec2.internal" not found Apr 16 16:48:03.444364 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.444347 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 16:48:03.445010 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:03.444990 2572 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 16:48:03.445175 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.445162 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 16:48:03.445691 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.445676 2572 factory.go:55] Registering systemd factory Apr 16 16:48:03.445691 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.445691 2572 factory.go:223] Registration of the systemd container factory successfully Apr 16 16:48:03.445893 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.445880 2572 factory.go:153] Registering CRI-O factory Apr 16 16:48:03.445954 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.445895 2572 factory.go:223] Registration of the crio container factory successfully Apr 16 16:48:03.445954 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.445899 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 16:48:03.445954 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.445900 2572 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 16:48:03.445954 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.445920 2572 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 16:48:03.446137 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.445963 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 16:48:03.446137 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.445971 2572 reconstruct.go:97] "Volume reconstruction finished" Apr 16 16:48:03.446137 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.445979 2572 reconciler.go:26] "Reconciler: start to sync state" Apr 16 16:48:03.446137 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.445987 
2572 factory.go:103] Registering Raw factory Apr 16 16:48:03.446137 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.446001 2572 manager.go:1196] Started watching for new ooms in manager Apr 16 16:48:03.447335 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:03.447299 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-126.ec2.internal\" not found" Apr 16 16:48:03.447575 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.447553 2572 manager.go:319] Starting recovery of all containers Apr 16 16:48:03.447973 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.447949 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:48:03.453856 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:03.453834 2572 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-137-126.ec2.internal\" not found" node="ip-10-0-137-126.ec2.internal" Apr 16 16:48:03.460608 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.460454 2572 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-137-126.ec2.internal" not found Apr 16 16:48:03.460691 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.460562 2572 manager.go:324] Recovery completed Apr 16 16:48:03.464680 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.464668 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:48:03.466617 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.466601 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-126.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:48:03.466683 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.466629 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-126.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:48:03.466683 ip-10-0-137-126 kubenswrapper[2572]: I0416 
16:48:03.466643 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-126.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:48:03.467110 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.467089 2572 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 16:48:03.467110 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.467101 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 16:48:03.467209 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.467116 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 16 16:48:03.469939 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.469923 2572 policy_none.go:49] "None policy: Start" Apr 16 16:48:03.469939 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.469939 2572 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 16:48:03.470038 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.469949 2572 state_mem.go:35] "Initializing new in-memory state store" Apr 16 16:48:03.502650 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.502637 2572 manager.go:341] "Starting Device Plugin manager" Apr 16 16:48:03.516915 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:03.502663 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 16:48:03.516915 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.502672 2572 server.go:85] "Starting device plugin registration server" Apr 16 16:48:03.516915 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.502879 2572 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 16:48:03.516915 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.502892 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 16:48:03.516915 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.502973 2572 plugin_watcher.go:51] "Plugin Watcher Start" 
path="/var/lib/kubelet/plugins_registry" Apr 16 16:48:03.516915 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.503039 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 16:48:03.516915 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.503048 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 16:48:03.516915 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:03.503622 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 16:48:03.516915 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:03.503661 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-126.ec2.internal\" not found" Apr 16 16:48:03.520233 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.520219 2572 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-137-126.ec2.internal" not found Apr 16 16:48:03.584218 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.584166 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 16:48:03.585329 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.585316 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 16:48:03.585395 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.585340 2572 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 16:48:03.585395 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.585354 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 16:48:03.585395 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.585362 2572 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 16:48:03.585395 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:03.585392 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 16:48:03.590356 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.590325 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:48:03.603714 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.603698 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:48:03.604942 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.604926 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-126.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:48:03.605018 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.604954 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-126.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:48:03.605018 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.604968 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-126.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:48:03.605018 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.604987 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-126.ec2.internal" Apr 16 16:48:03.614707 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.614691 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-126.ec2.internal" Apr 16 16:48:03.685812 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.685787 2572 kubelet.go:2537] "SyncLoop ADD" source="file" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-126.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-126.ec2.internal"] Apr 16 16:48:03.688625 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.688611 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-126.ec2.internal" Apr 16 16:48:03.688625 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.688619 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-126.ec2.internal" Apr 16 16:48:03.712598 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.712580 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-126.ec2.internal" Apr 16 16:48:03.717129 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.717117 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-126.ec2.internal" Apr 16 16:48:03.727936 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.727923 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 16:48:03.728260 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.728246 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 16:48:03.747383 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.747360 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/05d53b9573a1962a2ec184a8dbb43318-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-126.ec2.internal\" (UID: \"05d53b9573a1962a2ec184a8dbb43318\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-126.ec2.internal" Apr 16 16:48:03.747484 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.747386 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/05d53b9573a1962a2ec184a8dbb43318-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-126.ec2.internal\" (UID: \"05d53b9573a1962a2ec184a8dbb43318\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-126.ec2.internal" Apr 16 16:48:03.747484 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.747405 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a676821af174364e03710748c5f10fbf-config\") pod \"kube-apiserver-proxy-ip-10-0-137-126.ec2.internal\" (UID: \"a676821af174364e03710748c5f10fbf\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-126.ec2.internal" Apr 16 16:48:03.848015 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.847937 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/05d53b9573a1962a2ec184a8dbb43318-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-126.ec2.internal\" (UID: \"05d53b9573a1962a2ec184a8dbb43318\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-126.ec2.internal" Apr 16 16:48:03.848015 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.847967 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/05d53b9573a1962a2ec184a8dbb43318-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-126.ec2.internal\" (UID: \"05d53b9573a1962a2ec184a8dbb43318\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-126.ec2.internal" Apr 16 16:48:03.848015 ip-10-0-137-126 
kubenswrapper[2572]: I0416 16:48:03.847983 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a676821af174364e03710748c5f10fbf-config\") pod \"kube-apiserver-proxy-ip-10-0-137-126.ec2.internal\" (UID: \"a676821af174364e03710748c5f10fbf\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-126.ec2.internal" Apr 16 16:48:03.848015 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.848013 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a676821af174364e03710748c5f10fbf-config\") pod \"kube-apiserver-proxy-ip-10-0-137-126.ec2.internal\" (UID: \"a676821af174364e03710748c5f10fbf\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-126.ec2.internal" Apr 16 16:48:03.848257 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.848038 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/05d53b9573a1962a2ec184a8dbb43318-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-126.ec2.internal\" (UID: \"05d53b9573a1962a2ec184a8dbb43318\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-126.ec2.internal" Apr 16 16:48:03.848257 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:03.848042 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/05d53b9573a1962a2ec184a8dbb43318-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-126.ec2.internal\" (UID: \"05d53b9573a1962a2ec184a8dbb43318\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-126.ec2.internal" Apr 16 16:48:04.030814 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.030776 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-126.ec2.internal" Apr 16 16:48:04.033860 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.033308 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-126.ec2.internal" Apr 16 16:48:04.329236 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.329206 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 16:48:04.330055 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.329328 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 16:48:04.330055 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.329356 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 16:48:04.330055 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.329372 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 16:48:04.422293 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.422267 2572 apiserver.go:52] "Watching apiserver" Apr 16 16:48:04.427822 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.427782 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 16:43:03 +0000 
UTC" deadline="2027-11-05 21:22:04.550983869 +0000 UTC" Apr 16 16:48:04.427822 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.427812 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13636h34m0.123174347s" Apr 16 16:48:04.432573 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.432554 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 16:48:04.432890 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.432868 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-p5bf5","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-126.ec2.internal","openshift-multus/network-metrics-daemon-dv6f9","openshift-network-diagnostics/network-check-target-s7xbf","openshift-network-operator/iptables-alerter-bpcgz","openshift-ovn-kubernetes/ovnkube-node-brhp4","kube-system/konnectivity-agent-4chxn","kube-system/kube-apiserver-proxy-ip-10-0-137-126.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-trvsb","openshift-cluster-node-tuning-operator/tuned-kkn6k","openshift-image-registry/node-ca-9tnfz","openshift-multus/multus-8jhjk","openshift-multus/multus-additional-cni-plugins-92gx2"] Apr 16 16:48:04.436143 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.436123 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-p5bf5" Apr 16 16:48:04.436232 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.436150 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv6f9" Apr 16 16:48:04.436341 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:04.436311 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dv6f9" podUID="890f4655-f936-4bb9-b82c-524efb501585" Apr 16 16:48:04.437318 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.437300 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s7xbf" Apr 16 16:48:04.437386 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:04.437362 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s7xbf" podUID="feb3fcea-2282-411d-bb57-2562cc290f0a" Apr 16 16:48:04.438418 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.438396 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-bpcgz" Apr 16 16:48:04.438885 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.438867 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 16:48:04.438990 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.438867 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 16:48:04.439052 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.438872 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-9m6xk\"" Apr 16 16:48:04.439694 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.439675 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.440889 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.440731 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-kjxrk\"" Apr 16 16:48:04.440889 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.440739 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-4chxn" Apr 16 16:48:04.441415 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.441386 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 16:48:04.441415 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.441398 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 16:48:04.441553 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.441386 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:48:04.441984 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.441967 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 16:48:04.442093 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.441980 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-2sb4n\"" Apr 16 16:48:04.442254 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.442238 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 16:48:04.442321 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.442262 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 16:48:04.443276 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.443253 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-trvsb" Apr 16 16:48:04.443366 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.443307 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.443536 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.443518 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 16:48:04.443690 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.443665 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 16:48:04.443954 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.443934 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 16:48:04.444156 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.444141 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 16:48:04.444227 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.444156 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-cnlvd\"" Apr 16 16:48:04.444227 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.444206 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 16:48:04.444486 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.444471 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 16:48:04.444544 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.444504 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-9tnfz" Apr 16 16:48:04.445495 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.445480 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-56j9w\"" Apr 16 16:48:04.445876 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.445860 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.446038 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.446019 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 16:48:04.446220 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.446199 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 16:48:04.446220 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.446218 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:48:04.446363 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.446026 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 16:48:04.446425 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.446411 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 16:48:04.446629 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.446614 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-6d8vz\"" Apr 16 16:48:04.446741 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.446725 2572 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 16:48:04.447130 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.447112 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-92gx2" Apr 16 16:48:04.447360 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.447235 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 16:48:04.447360 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.447264 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 16:48:04.447360 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.447274 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-6pxz4\"" Apr 16 16:48:04.448504 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.448486 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 16:48:04.448504 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.448497 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 16:48:04.448709 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.448694 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 16:48:04.448773 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.448728 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 16:48:04.449528 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.449508 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"default-dockercfg-m6m8m\"" Apr 16 16:48:04.450144 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.450126 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-nxv7k\"" Apr 16 16:48:04.450312 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.450142 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 16:48:04.450388 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.450364 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 16:48:04.451191 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.451171 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44v89\" (UniqueName: \"kubernetes.io/projected/caaa1d46-b551-4960-b546-994b0ee36fed-kube-api-access-44v89\") pod \"node-ca-9tnfz\" (UID: \"caaa1d46-b551-4960-b546-994b0ee36fed\") " pod="openshift-image-registry/node-ca-9tnfz" Apr 16 16:48:04.451330 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.451313 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-cnibin\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.451431 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.451418 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-host-var-lib-cni-multus\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.451523 ip-10-0-137-126 
kubenswrapper[2572]: I0416 16:48:04.451512 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/74ade214-8512-4cf5-93e8-0ece0e5776f2-cni-binary-copy\") pod \"multus-additional-cni-plugins-92gx2\" (UID: \"74ade214-8512-4cf5-93e8-0ece0e5776f2\") " pod="openshift-multus/multus-additional-cni-plugins-92gx2" Apr 16 16:48:04.452135 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.451651 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b3dfeb2f-a4ab-4fe5-ab2c-6c8dacead269-hosts-file\") pod \"node-resolver-p5bf5\" (UID: \"b3dfeb2f-a4ab-4fe5-ab2c-6c8dacead269\") " pod="openshift-dns/node-resolver-p5bf5" Apr 16 16:48:04.452234 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.452158 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/22c0427d-87ec-49df-bb00-e6b332332ea9-etc-tuned\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.452234 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.452191 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-multus-socket-dir-parent\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.452234 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.452225 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-host-run-netns\") pod \"ovnkube-node-brhp4\" (UID: 
\"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.452370 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.452255 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-etc-kubernetes\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.452370 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.452278 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-var-lib-kubelet\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.452370 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.452305 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-run-systemd\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.452370 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.452333 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-log-socket\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.452370 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.452359 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/c0c5c0a0-29b2-4743-af7a-0c1150829a60-env-overrides\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.452588 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.452388 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c0c5c0a0-29b2-4743-af7a-0c1150829a60-ovn-node-metrics-cert\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.452588 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.452416 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-etc-modprobe-d\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.452588 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.452439 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9f643739-4068-4891-858f-02df7c38bdb7-iptables-alerter-script\") pod \"iptables-alerter-bpcgz\" (UID: \"9f643739-4068-4891-858f-02df7c38bdb7\") " pod="openshift-network-operator/iptables-alerter-bpcgz" Apr 16 16:48:04.452588 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.452466 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/caaa1d46-b551-4960-b546-994b0ee36fed-host\") pod \"node-ca-9tnfz\" (UID: \"caaa1d46-b551-4960-b546-994b0ee36fed\") " pod="openshift-image-registry/node-ca-9tnfz" Apr 16 16:48:04.452588 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.452499 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-hostroot\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.452588 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.452527 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/74ade214-8512-4cf5-93e8-0ece0e5776f2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-92gx2\" (UID: \"74ade214-8512-4cf5-93e8-0ece0e5776f2\") " pod="openshift-multus/multus-additional-cni-plugins-92gx2" Apr 16 16:48:04.452588 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.452569 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fb92\" (UniqueName: \"kubernetes.io/projected/b3dfeb2f-a4ab-4fe5-ab2c-6c8dacead269-kube-api-access-2fb92\") pod \"node-resolver-p5bf5\" (UID: \"b3dfeb2f-a4ab-4fe5-ab2c-6c8dacead269\") " pod="openshift-dns/node-resolver-p5bf5" Apr 16 16:48:04.452834 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.452668 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-host-kubelet\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.452834 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.452700 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c0c5c0a0-29b2-4743-af7a-0c1150829a60-ovnkube-script-lib\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.452834 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.452731 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tsb8\" (UniqueName: \"kubernetes.io/projected/76470f07-89a2-4aa9-b5e0-2d90fd9048ab-kube-api-access-6tsb8\") pod \"aws-ebs-csi-driver-node-trvsb\" (UID: \"76470f07-89a2-4aa9-b5e0-2d90fd9048ab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-trvsb" Apr 16 16:48:04.453078 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453039 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr7pr\" (UniqueName: \"kubernetes.io/projected/9f643739-4068-4891-858f-02df7c38bdb7-kube-api-access-lr7pr\") pod \"iptables-alerter-bpcgz\" (UID: \"9f643739-4068-4891-858f-02df7c38bdb7\") " pod="openshift-network-operator/iptables-alerter-bpcgz" Apr 16 16:48:04.453175 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453097 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-system-cni-dir\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.453175 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453123 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-multus-conf-dir\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.453175 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453144 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/74ade214-8512-4cf5-93e8-0ece0e5776f2-system-cni-dir\") pod \"multus-additional-cni-plugins-92gx2\" (UID: \"74ade214-8512-4cf5-93e8-0ece0e5776f2\") " pod="openshift-multus/multus-additional-cni-plugins-92gx2" Apr 16 16:48:04.453320 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453174 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-var-lib-openvswitch\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.453320 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453196 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-run-ovn\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.453320 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453224 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.453320 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453246 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/76470f07-89a2-4aa9-b5e0-2d90fd9048ab-device-dir\") pod \"aws-ebs-csi-driver-node-trvsb\" (UID: \"76470f07-89a2-4aa9-b5e0-2d90fd9048ab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-trvsb" Apr 16 
16:48:04.453320 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453268 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/74ade214-8512-4cf5-93e8-0ece0e5776f2-cnibin\") pod \"multus-additional-cni-plugins-92gx2\" (UID: \"74ade214-8512-4cf5-93e8-0ece0e5776f2\") " pod="openshift-multus/multus-additional-cni-plugins-92gx2" Apr 16 16:48:04.453320 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453288 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-host-cni-bin\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.453320 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453308 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c0c5c0a0-29b2-4743-af7a-0c1150829a60-ovnkube-config\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.453579 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453331 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b78e00dd-6abc-4e46-83bf-28cd51e87cc9-konnectivity-ca\") pod \"konnectivity-agent-4chxn\" (UID: \"b78e00dd-6abc-4e46-83bf-28cd51e87cc9\") " pod="kube-system/konnectivity-agent-4chxn" Apr 16 16:48:04.453579 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453354 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/76470f07-89a2-4aa9-b5e0-2d90fd9048ab-registration-dir\") pod 
\"aws-ebs-csi-driver-node-trvsb\" (UID: \"76470f07-89a2-4aa9-b5e0-2d90fd9048ab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-trvsb" Apr 16 16:48:04.453579 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453376 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/76470f07-89a2-4aa9-b5e0-2d90fd9048ab-sys-fs\") pod \"aws-ebs-csi-driver-node-trvsb\" (UID: \"76470f07-89a2-4aa9-b5e0-2d90fd9048ab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-trvsb" Apr 16 16:48:04.453579 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453404 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-etc-sysconfig\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.453579 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453427 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-etc-sysctl-conf\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.453579 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453464 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/caaa1d46-b551-4960-b546-994b0ee36fed-serviceca\") pod \"node-ca-9tnfz\" (UID: \"caaa1d46-b551-4960-b546-994b0ee36fed\") " pod="openshift-image-registry/node-ca-9tnfz" Apr 16 16:48:04.453579 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453488 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/74ade214-8512-4cf5-93e8-0ece0e5776f2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-92gx2\" (UID: \"74ade214-8512-4cf5-93e8-0ece0e5776f2\") " pod="openshift-multus/multus-additional-cni-plugins-92gx2" Apr 16 16:48:04.453579 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453509 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/76470f07-89a2-4aa9-b5e0-2d90fd9048ab-etc-selinux\") pod \"aws-ebs-csi-driver-node-trvsb\" (UID: \"76470f07-89a2-4aa9-b5e0-2d90fd9048ab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-trvsb" Apr 16 16:48:04.453579 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453530 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-host\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.453579 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453556 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/22c0427d-87ec-49df-bb00-e6b332332ea9-tmp\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.453579 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453572 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w75zf\" (UniqueName: \"kubernetes.io/projected/890f4655-f936-4bb9-b82c-524efb501585-kube-api-access-w75zf\") pod \"network-metrics-daemon-dv6f9\" (UID: \"890f4655-f936-4bb9-b82c-524efb501585\") " pod="openshift-multus/network-metrics-daemon-dv6f9" Apr 16 16:48:04.454016 
ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453586 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-os-release\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.454016 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453605 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-cni-binary-copy\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.454016 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453630 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-host-var-lib-cni-bin\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.454016 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453660 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-host-var-lib-kubelet\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.454016 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453689 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-multus-daemon-config\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " 
pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.454016 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453709 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b3dfeb2f-a4ab-4fe5-ab2c-6c8dacead269-tmp-dir\") pod \"node-resolver-p5bf5\" (UID: \"b3dfeb2f-a4ab-4fe5-ab2c-6c8dacead269\") " pod="openshift-dns/node-resolver-p5bf5" Apr 16 16:48:04.454016 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453730 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-etc-systemd\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.454016 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453744 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-sys\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.454016 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453780 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9f643739-4068-4891-858f-02df7c38bdb7-host-slash\") pod \"iptables-alerter-bpcgz\" (UID: \"9f643739-4068-4891-858f-02df7c38bdb7\") " pod="openshift-network-operator/iptables-alerter-bpcgz" Apr 16 16:48:04.454016 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453812 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-host-run-multus-certs\") pod 
\"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.454016 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453842 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/890f4655-f936-4bb9-b82c-524efb501585-metrics-certs\") pod \"network-metrics-daemon-dv6f9\" (UID: \"890f4655-f936-4bb9-b82c-524efb501585\") " pod="openshift-multus/network-metrics-daemon-dv6f9" Apr 16 16:48:04.454016 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453864 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-systemd-units\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.454016 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453880 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76470f07-89a2-4aa9-b5e0-2d90fd9048ab-kubelet-dir\") pod \"aws-ebs-csi-driver-node-trvsb\" (UID: \"76470f07-89a2-4aa9-b5e0-2d90fd9048ab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-trvsb" Apr 16 16:48:04.454016 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453911 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-run\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.454016 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453928 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ddln9\" (UniqueName: \"kubernetes.io/projected/22c0427d-87ec-49df-bb00-e6b332332ea9-kube-api-access-ddln9\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.454016 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.453974 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-multus-cni-dir\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.454016 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.454007 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-host-run-k8s-cni-cncf-io\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.454709 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.454034 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/74ade214-8512-4cf5-93e8-0ece0e5776f2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-92gx2\" (UID: \"74ade214-8512-4cf5-93e8-0ece0e5776f2\") " pod="openshift-multus/multus-additional-cni-plugins-92gx2" Apr 16 16:48:04.454709 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.454083 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggqmc\" (UniqueName: \"kubernetes.io/projected/74ade214-8512-4cf5-93e8-0ece0e5776f2-kube-api-access-ggqmc\") pod \"multus-additional-cni-plugins-92gx2\" (UID: \"74ade214-8512-4cf5-93e8-0ece0e5776f2\") " 
pod="openshift-multus/multus-additional-cni-plugins-92gx2" Apr 16 16:48:04.454709 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.454107 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-etc-openvswitch\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.454709 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.454132 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-run-openvswitch\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.454709 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.454155 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-lib-modules\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.454709 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.454178 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-etc-kubernetes\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.454709 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.454200 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-host-slash\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.454709 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.454222 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-host-run-ovn-kubernetes\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.454709 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.454255 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-etc-sysctl-d\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.454709 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.454282 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc6j4\" (UniqueName: \"kubernetes.io/projected/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-kube-api-access-mc6j4\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.454709 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.454305 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-564lt\" (UniqueName: \"kubernetes.io/projected/feb3fcea-2282-411d-bb57-2562cc290f0a-kube-api-access-564lt\") pod \"network-check-target-s7xbf\" (UID: \"feb3fcea-2282-411d-bb57-2562cc290f0a\") " pod="openshift-network-diagnostics/network-check-target-s7xbf" Apr 16 16:48:04.454709 ip-10-0-137-126 kubenswrapper[2572]: 
I0416 16:48:04.454327 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b78e00dd-6abc-4e46-83bf-28cd51e87cc9-agent-certs\") pod \"konnectivity-agent-4chxn\" (UID: \"b78e00dd-6abc-4e46-83bf-28cd51e87cc9\") " pod="kube-system/konnectivity-agent-4chxn" Apr 16 16:48:04.454709 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.454360 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-node-log\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.454709 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.454387 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-host-cni-netd\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.454709 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.454448 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g79dx\" (UniqueName: \"kubernetes.io/projected/c0c5c0a0-29b2-4743-af7a-0c1150829a60-kube-api-access-g79dx\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.454709 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.454474 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/76470f07-89a2-4aa9-b5e0-2d90fd9048ab-socket-dir\") pod \"aws-ebs-csi-driver-node-trvsb\" (UID: \"76470f07-89a2-4aa9-b5e0-2d90fd9048ab\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-trvsb" Apr 16 16:48:04.455150 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.454501 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-host-run-netns\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.455150 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.454535 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/74ade214-8512-4cf5-93e8-0ece0e5776f2-os-release\") pod \"multus-additional-cni-plugins-92gx2\" (UID: \"74ade214-8512-4cf5-93e8-0ece0e5776f2\") " pod="openshift-multus/multus-additional-cni-plugins-92gx2" Apr 16 16:48:04.455444 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.455427 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 16:48:04.470686 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.470667 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-9ztx6" Apr 16 16:48:04.476011 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.475993 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-9ztx6" Apr 16 16:48:04.534487 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:04.534457 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda676821af174364e03710748c5f10fbf.slice/crio-8fcc136cf441c4ea1f60f78cc4b2d2cc64f10dcb6d45551008c32d42dd678379 WatchSource:0}: Error finding container 
8fcc136cf441c4ea1f60f78cc4b2d2cc64f10dcb6d45551008c32d42dd678379: Status 404 returned error can't find the container with id 8fcc136cf441c4ea1f60f78cc4b2d2cc64f10dcb6d45551008c32d42dd678379 Apr 16 16:48:04.534780 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:04.534761 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05d53b9573a1962a2ec184a8dbb43318.slice/crio-a0ebf992c75773ca961b31f27cd8317e339cd968ece3f449bbb3be917d9a4f42 WatchSource:0}: Error finding container a0ebf992c75773ca961b31f27cd8317e339cd968ece3f449bbb3be917d9a4f42: Status 404 returned error can't find the container with id a0ebf992c75773ca961b31f27cd8317e339cd968ece3f449bbb3be917d9a4f42 Apr 16 16:48:04.540658 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.540643 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:48:04.546922 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.546902 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 16:48:04.555113 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555094 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-run-systemd\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.555188 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555125 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-log-socket\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.555188 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555142 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c0c5c0a0-29b2-4743-af7a-0c1150829a60-env-overrides\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.555188 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555162 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c0c5c0a0-29b2-4743-af7a-0c1150829a60-ovn-node-metrics-cert\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.555188 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555183 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-etc-modprobe-d\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.555362 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555188 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-run-systemd\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.555362 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555204 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9f643739-4068-4891-858f-02df7c38bdb7-iptables-alerter-script\") pod \"iptables-alerter-bpcgz\" (UID: \"9f643739-4068-4891-858f-02df7c38bdb7\") " pod="openshift-network-operator/iptables-alerter-bpcgz" Apr 16 16:48:04.555362 ip-10-0-137-126 
kubenswrapper[2572]: I0416 16:48:04.555189 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-log-socket\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.555362 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555248 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/caaa1d46-b551-4960-b546-994b0ee36fed-host\") pod \"node-ca-9tnfz\" (UID: \"caaa1d46-b551-4960-b546-994b0ee36fed\") " pod="openshift-image-registry/node-ca-9tnfz" Apr 16 16:48:04.555362 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555278 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-hostroot\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.555362 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555306 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/74ade214-8512-4cf5-93e8-0ece0e5776f2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-92gx2\" (UID: \"74ade214-8512-4cf5-93e8-0ece0e5776f2\") " pod="openshift-multus/multus-additional-cni-plugins-92gx2" Apr 16 16:48:04.555362 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555340 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-etc-modprobe-d\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.555362 ip-10-0-137-126 kubenswrapper[2572]: I0416 
16:48:04.555354 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fb92\" (UniqueName: \"kubernetes.io/projected/b3dfeb2f-a4ab-4fe5-ab2c-6c8dacead269-kube-api-access-2fb92\") pod \"node-resolver-p5bf5\" (UID: \"b3dfeb2f-a4ab-4fe5-ab2c-6c8dacead269\") " pod="openshift-dns/node-resolver-p5bf5" Apr 16 16:48:04.555362 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555358 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-hostroot\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.555758 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555311 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/caaa1d46-b551-4960-b546-994b0ee36fed-host\") pod \"node-ca-9tnfz\" (UID: \"caaa1d46-b551-4960-b546-994b0ee36fed\") " pod="openshift-image-registry/node-ca-9tnfz" Apr 16 16:48:04.555758 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555379 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-host-kubelet\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.555758 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555405 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c0c5c0a0-29b2-4743-af7a-0c1150829a60-ovnkube-script-lib\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.555758 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555500 2572 swap_util.go:74] "error creating 
dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 16:48:04.555758 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555612 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tsb8\" (UniqueName: \"kubernetes.io/projected/76470f07-89a2-4aa9-b5e0-2d90fd9048ab-kube-api-access-6tsb8\") pod \"aws-ebs-csi-driver-node-trvsb\" (UID: \"76470f07-89a2-4aa9-b5e0-2d90fd9048ab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-trvsb" Apr 16 16:48:04.555758 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555631 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c0c5c0a0-29b2-4743-af7a-0c1150829a60-env-overrides\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.555758 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555646 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lr7pr\" (UniqueName: \"kubernetes.io/projected/9f643739-4068-4891-858f-02df7c38bdb7-kube-api-access-lr7pr\") pod \"iptables-alerter-bpcgz\" (UID: \"9f643739-4068-4891-858f-02df7c38bdb7\") " pod="openshift-network-operator/iptables-alerter-bpcgz" Apr 16 16:48:04.555758 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555676 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-system-cni-dir\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.555758 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555703 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-multus-conf-dir\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.555758 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555728 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/74ade214-8512-4cf5-93e8-0ece0e5776f2-system-cni-dir\") pod \"multus-additional-cni-plugins-92gx2\" (UID: \"74ade214-8512-4cf5-93e8-0ece0e5776f2\") " pod="openshift-multus/multus-additional-cni-plugins-92gx2" Apr 16 16:48:04.555758 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555733 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-host-kubelet\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.555758 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555754 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-var-lib-openvswitch\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.555758 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555753 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9f643739-4068-4891-858f-02df7c38bdb7-iptables-alerter-script\") pod \"iptables-alerter-bpcgz\" (UID: \"9f643739-4068-4891-858f-02df7c38bdb7\") " pod="openshift-network-operator/iptables-alerter-bpcgz" Apr 16 16:48:04.556381 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555779 2572 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-system-cni-dir\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.556381 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555779 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-run-ovn\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.556381 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555805 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.556381 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555808 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c0c5c0a0-29b2-4743-af7a-0c1150829a60-ovnkube-script-lib\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.556381 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555823 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/76470f07-89a2-4aa9-b5e0-2d90fd9048ab-device-dir\") pod \"aws-ebs-csi-driver-node-trvsb\" (UID: \"76470f07-89a2-4aa9-b5e0-2d90fd9048ab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-trvsb" Apr 16 16:48:04.556381 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555821 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-run-ovn\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.556381 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555856 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/74ade214-8512-4cf5-93e8-0ece0e5776f2-cnibin\") pod \"multus-additional-cni-plugins-92gx2\" (UID: \"74ade214-8512-4cf5-93e8-0ece0e5776f2\") " pod="openshift-multus/multus-additional-cni-plugins-92gx2" Apr 16 16:48:04.556381 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555869 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/74ade214-8512-4cf5-93e8-0ece0e5776f2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-92gx2\" (UID: \"74ade214-8512-4cf5-93e8-0ece0e5776f2\") " pod="openshift-multus/multus-additional-cni-plugins-92gx2" Apr 16 16:48:04.556381 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555881 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-host-cni-bin\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.556381 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555906 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 
16:48:04.556381 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555918 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/74ade214-8512-4cf5-93e8-0ece0e5776f2-system-cni-dir\") pod \"multus-additional-cni-plugins-92gx2\" (UID: \"74ade214-8512-4cf5-93e8-0ece0e5776f2\") " pod="openshift-multus/multus-additional-cni-plugins-92gx2" Apr 16 16:48:04.556381 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555874 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-multus-conf-dir\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.556381 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555912 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/76470f07-89a2-4aa9-b5e0-2d90fd9048ab-device-dir\") pod \"aws-ebs-csi-driver-node-trvsb\" (UID: \"76470f07-89a2-4aa9-b5e0-2d90fd9048ab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-trvsb" Apr 16 16:48:04.556381 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555906 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c0c5c0a0-29b2-4743-af7a-0c1150829a60-ovnkube-config\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.556381 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555947 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-host-cni-bin\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" 
Apr 16 16:48:04.556381 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555875 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-var-lib-openvswitch\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.556381 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555968 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b78e00dd-6abc-4e46-83bf-28cd51e87cc9-konnectivity-ca\") pod \"konnectivity-agent-4chxn\" (UID: \"b78e00dd-6abc-4e46-83bf-28cd51e87cc9\") " pod="kube-system/konnectivity-agent-4chxn" Apr 16 16:48:04.557194 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555994 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/76470f07-89a2-4aa9-b5e0-2d90fd9048ab-registration-dir\") pod \"aws-ebs-csi-driver-node-trvsb\" (UID: \"76470f07-89a2-4aa9-b5e0-2d90fd9048ab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-trvsb" Apr 16 16:48:04.557194 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556009 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/76470f07-89a2-4aa9-b5e0-2d90fd9048ab-sys-fs\") pod \"aws-ebs-csi-driver-node-trvsb\" (UID: \"76470f07-89a2-4aa9-b5e0-2d90fd9048ab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-trvsb" Apr 16 16:48:04.557194 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.555971 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/74ade214-8512-4cf5-93e8-0ece0e5776f2-cnibin\") pod \"multus-additional-cni-plugins-92gx2\" (UID: 
\"74ade214-8512-4cf5-93e8-0ece0e5776f2\") " pod="openshift-multus/multus-additional-cni-plugins-92gx2" Apr 16 16:48:04.557194 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556034 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-etc-sysconfig\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.557194 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556049 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/76470f07-89a2-4aa9-b5e0-2d90fd9048ab-sys-fs\") pod \"aws-ebs-csi-driver-node-trvsb\" (UID: \"76470f07-89a2-4aa9-b5e0-2d90fd9048ab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-trvsb" Apr 16 16:48:04.557194 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556087 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-etc-sysctl-conf\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.557194 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556099 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/76470f07-89a2-4aa9-b5e0-2d90fd9048ab-registration-dir\") pod \"aws-ebs-csi-driver-node-trvsb\" (UID: \"76470f07-89a2-4aa9-b5e0-2d90fd9048ab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-trvsb" Apr 16 16:48:04.557194 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556115 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/caaa1d46-b551-4960-b546-994b0ee36fed-serviceca\") pod \"node-ca-9tnfz\" (UID: \"caaa1d46-b551-4960-b546-994b0ee36fed\") " pod="openshift-image-registry/node-ca-9tnfz" Apr 16 16:48:04.557194 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556132 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/74ade214-8512-4cf5-93e8-0ece0e5776f2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-92gx2\" (UID: \"74ade214-8512-4cf5-93e8-0ece0e5776f2\") " pod="openshift-multus/multus-additional-cni-plugins-92gx2" Apr 16 16:48:04.557194 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556147 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/76470f07-89a2-4aa9-b5e0-2d90fd9048ab-etc-selinux\") pod \"aws-ebs-csi-driver-node-trvsb\" (UID: \"76470f07-89a2-4aa9-b5e0-2d90fd9048ab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-trvsb" Apr 16 16:48:04.557194 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556131 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-etc-sysconfig\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.557194 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556164 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-host\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.557194 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556202 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/22c0427d-87ec-49df-bb00-e6b332332ea9-tmp\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.557194 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556230 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w75zf\" (UniqueName: \"kubernetes.io/projected/890f4655-f936-4bb9-b82c-524efb501585-kube-api-access-w75zf\") pod \"network-metrics-daemon-dv6f9\" (UID: \"890f4655-f936-4bb9-b82c-524efb501585\") " pod="openshift-multus/network-metrics-daemon-dv6f9" Apr 16 16:48:04.557194 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556250 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/76470f07-89a2-4aa9-b5e0-2d90fd9048ab-etc-selinux\") pod \"aws-ebs-csi-driver-node-trvsb\" (UID: \"76470f07-89a2-4aa9-b5e0-2d90fd9048ab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-trvsb" Apr 16 16:48:04.557194 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556258 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-os-release\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.557194 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556286 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-cni-binary-copy\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.557983 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556292 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-host\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.557983 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556318 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-host-var-lib-cni-bin\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.557983 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556342 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-host-var-lib-kubelet\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.557983 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556370 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-multus-daemon-config\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.557983 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556393 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b3dfeb2f-a4ab-4fe5-ab2c-6c8dacead269-tmp-dir\") pod \"node-resolver-p5bf5\" (UID: \"b3dfeb2f-a4ab-4fe5-ab2c-6c8dacead269\") " pod="openshift-dns/node-resolver-p5bf5" Apr 16 16:48:04.557983 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556417 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-etc-systemd\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.557983 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556441 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-sys\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.557983 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556463 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9f643739-4068-4891-858f-02df7c38bdb7-host-slash\") pod \"iptables-alerter-bpcgz\" (UID: \"9f643739-4068-4891-858f-02df7c38bdb7\") " pod="openshift-network-operator/iptables-alerter-bpcgz" Apr 16 16:48:04.557983 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556475 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c0c5c0a0-29b2-4743-af7a-0c1150829a60-ovnkube-config\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.557983 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556491 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-host-run-multus-certs\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.557983 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556497 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/caaa1d46-b551-4960-b546-994b0ee36fed-serviceca\") pod \"node-ca-9tnfz\" (UID: \"caaa1d46-b551-4960-b546-994b0ee36fed\") " pod="openshift-image-registry/node-ca-9tnfz" Apr 16 16:48:04.557983 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556517 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/890f4655-f936-4bb9-b82c-524efb501585-metrics-certs\") pod \"network-metrics-daemon-dv6f9\" (UID: \"890f4655-f936-4bb9-b82c-524efb501585\") " pod="openshift-multus/network-metrics-daemon-dv6f9" Apr 16 16:48:04.557983 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556542 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-systemd-units\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.557983 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556562 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b78e00dd-6abc-4e46-83bf-28cd51e87cc9-konnectivity-ca\") pod \"konnectivity-agent-4chxn\" (UID: \"b78e00dd-6abc-4e46-83bf-28cd51e87cc9\") " pod="kube-system/konnectivity-agent-4chxn" Apr 16 16:48:04.557983 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556571 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76470f07-89a2-4aa9-b5e0-2d90fd9048ab-kubelet-dir\") pod \"aws-ebs-csi-driver-node-trvsb\" (UID: \"76470f07-89a2-4aa9-b5e0-2d90fd9048ab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-trvsb" Apr 16 16:48:04.557983 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556596 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" 
(UniqueName: \"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-run\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.557983 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556601 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-etc-sysctl-conf\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.557983 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556621 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddln9\" (UniqueName: \"kubernetes.io/projected/22c0427d-87ec-49df-bb00-e6b332332ea9-kube-api-access-ddln9\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.558803 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556636 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-os-release\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.558803 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556645 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-multus-cni-dir\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.558803 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556679 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/9f643739-4068-4891-858f-02df7c38bdb7-host-slash\") pod \"iptables-alerter-bpcgz\" (UID: \"9f643739-4068-4891-858f-02df7c38bdb7\") " pod="openshift-network-operator/iptables-alerter-bpcgz" Apr 16 16:48:04.558803 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556694 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-systemd-units\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.558803 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556706 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-host-run-k8s-cni-cncf-io\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.558803 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556706 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-multus-cni-dir\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.558803 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556732 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-host-run-multus-certs\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.558803 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556748 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/74ade214-8512-4cf5-93e8-0ece0e5776f2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-92gx2\" (UID: \"74ade214-8512-4cf5-93e8-0ece0e5776f2\") " pod="openshift-multus/multus-additional-cni-plugins-92gx2" Apr 16 16:48:04.558803 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556777 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggqmc\" (UniqueName: \"kubernetes.io/projected/74ade214-8512-4cf5-93e8-0ece0e5776f2-kube-api-access-ggqmc\") pod \"multus-additional-cni-plugins-92gx2\" (UID: \"74ade214-8512-4cf5-93e8-0ece0e5776f2\") " pod="openshift-multus/multus-additional-cni-plugins-92gx2" Apr 16 16:48:04.558803 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:04.556793 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:48:04.558803 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556765 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76470f07-89a2-4aa9-b5e0-2d90fd9048ab-kubelet-dir\") pod \"aws-ebs-csi-driver-node-trvsb\" (UID: \"76470f07-89a2-4aa9-b5e0-2d90fd9048ab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-trvsb" Apr 16 16:48:04.558803 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556806 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-etc-openvswitch\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.558803 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556837 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-cni-binary-copy\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.558803 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556849 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-run\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.558803 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556854 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-etc-openvswitch\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.558803 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:04.556854 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/890f4655-f936-4bb9-b82c-524efb501585-metrics-certs podName:890f4655-f936-4bb9-b82c-524efb501585 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:05.056819395 +0000 UTC m=+2.225530737 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/890f4655-f936-4bb9-b82c-524efb501585-metrics-certs") pod "network-metrics-daemon-dv6f9" (UID: "890f4655-f936-4bb9-b82c-524efb501585") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:48:04.558803 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556889 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-run-openvswitch\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.559415 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556909 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-host-var-lib-cni-bin\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.559415 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556917 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-lib-modules\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.559415 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556944 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-etc-kubernetes\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.559415 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556947 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-host-var-lib-kubelet\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.559415 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556968 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-host-slash\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.559415 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.556996 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/74ade214-8512-4cf5-93e8-0ece0e5776f2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-92gx2\" (UID: \"74ade214-8512-4cf5-93e8-0ece0e5776f2\") " pod="openshift-multus/multus-additional-cni-plugins-92gx2" Apr 16 16:48:04.559415 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557037 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-etc-systemd\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.559415 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557048 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-host-run-ovn-kubernetes\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.559415 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557053 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-host-slash\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.559415 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557000 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-host-run-ovn-kubernetes\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.559415 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557092 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-run-openvswitch\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.559415 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557109 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-etc-sysctl-d\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.559415 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557113 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-etc-kubernetes\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.559415 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557138 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mc6j4\" (UniqueName: \"kubernetes.io/projected/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-kube-api-access-mc6j4\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.559415 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557164 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-564lt\" (UniqueName: \"kubernetes.io/projected/feb3fcea-2282-411d-bb57-2562cc290f0a-kube-api-access-564lt\") pod \"network-check-target-s7xbf\" (UID: \"feb3fcea-2282-411d-bb57-2562cc290f0a\") " pod="openshift-network-diagnostics/network-check-target-s7xbf" Apr 16 16:48:04.559415 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557187 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b78e00dd-6abc-4e46-83bf-28cd51e87cc9-agent-certs\") pod \"konnectivity-agent-4chxn\" (UID: \"b78e00dd-6abc-4e46-83bf-28cd51e87cc9\") " pod="kube-system/konnectivity-agent-4chxn" Apr 16 16:48:04.559415 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557191 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-lib-modules\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.559415 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557207 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-etc-sysctl-d\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.559880 ip-10-0-137-126 kubenswrapper[2572]: I0416 
16:48:04.557212 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-node-log\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.559880 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557224 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-sys\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.559880 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557230 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b3dfeb2f-a4ab-4fe5-ab2c-6c8dacead269-tmp-dir\") pod \"node-resolver-p5bf5\" (UID: \"b3dfeb2f-a4ab-4fe5-ab2c-6c8dacead269\") " pod="openshift-dns/node-resolver-p5bf5" Apr 16 16:48:04.559880 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557236 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-host-cni-netd\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.559880 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557265 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g79dx\" (UniqueName: \"kubernetes.io/projected/c0c5c0a0-29b2-4743-af7a-0c1150829a60-kube-api-access-g79dx\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.559880 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557267 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-host-run-k8s-cni-cncf-io\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.559880 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557364 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-host-cni-netd\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.559880 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557403 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-node-log\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.559880 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557453 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/76470f07-89a2-4aa9-b5e0-2d90fd9048ab-socket-dir\") pod \"aws-ebs-csi-driver-node-trvsb\" (UID: \"76470f07-89a2-4aa9-b5e0-2d90fd9048ab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-trvsb" Apr 16 16:48:04.559880 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557486 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-host-run-netns\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.559880 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557511 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/74ade214-8512-4cf5-93e8-0ece0e5776f2-os-release\") pod \"multus-additional-cni-plugins-92gx2\" (UID: \"74ade214-8512-4cf5-93e8-0ece0e5776f2\") " pod="openshift-multus/multus-additional-cni-plugins-92gx2" Apr 16 16:48:04.559880 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557538 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-44v89\" (UniqueName: \"kubernetes.io/projected/caaa1d46-b551-4960-b546-994b0ee36fed-kube-api-access-44v89\") pod \"node-ca-9tnfz\" (UID: \"caaa1d46-b551-4960-b546-994b0ee36fed\") " pod="openshift-image-registry/node-ca-9tnfz" Apr 16 16:48:04.559880 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557563 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-cnibin\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.559880 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557589 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-host-var-lib-cni-multus\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.559880 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557607 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-host-run-netns\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.559880 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557617 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/74ade214-8512-4cf5-93e8-0ece0e5776f2-cni-binary-copy\") pod \"multus-additional-cni-plugins-92gx2\" (UID: \"74ade214-8512-4cf5-93e8-0ece0e5776f2\") " pod="openshift-multus/multus-additional-cni-plugins-92gx2" Apr 16 16:48:04.559880 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557642 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b3dfeb2f-a4ab-4fe5-ab2c-6c8dacead269-hosts-file\") pod \"node-resolver-p5bf5\" (UID: \"b3dfeb2f-a4ab-4fe5-ab2c-6c8dacead269\") " pod="openshift-dns/node-resolver-p5bf5" Apr 16 16:48:04.559880 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557667 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/22c0427d-87ec-49df-bb00-e6b332332ea9-etc-tuned\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.560404 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557679 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/74ade214-8512-4cf5-93e8-0ece0e5776f2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-92gx2\" (UID: \"74ade214-8512-4cf5-93e8-0ece0e5776f2\") " pod="openshift-multus/multus-additional-cni-plugins-92gx2" Apr 16 16:48:04.560404 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557691 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-multus-socket-dir-parent\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.560404 ip-10-0-137-126 
kubenswrapper[2572]: I0416 16:48:04.557719 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-host-run-netns\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.560404 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557727 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/76470f07-89a2-4aa9-b5e0-2d90fd9048ab-socket-dir\") pod \"aws-ebs-csi-driver-node-trvsb\" (UID: \"76470f07-89a2-4aa9-b5e0-2d90fd9048ab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-trvsb" Apr 16 16:48:04.560404 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557741 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-host-var-lib-cni-multus\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.560404 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557745 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-etc-kubernetes\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.560404 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557775 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-var-lib-kubelet\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 
16:48:04.560404 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557805 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-cnibin\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.560404 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557841 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-var-lib-kubelet\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.560404 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557863 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/74ade214-8512-4cf5-93e8-0ece0e5776f2-os-release\") pod \"multus-additional-cni-plugins-92gx2\" (UID: \"74ade214-8512-4cf5-93e8-0ece0e5776f2\") " pod="openshift-multus/multus-additional-cni-plugins-92gx2" Apr 16 16:48:04.560404 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557890 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c0c5c0a0-29b2-4743-af7a-0c1150829a60-host-run-netns\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.560404 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557940 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/22c0427d-87ec-49df-bb00-e6b332332ea9-etc-kubernetes\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.560404 ip-10-0-137-126 
kubenswrapper[2572]: I0416 16:48:04.557940 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b3dfeb2f-a4ab-4fe5-ab2c-6c8dacead269-hosts-file\") pod \"node-resolver-p5bf5\" (UID: \"b3dfeb2f-a4ab-4fe5-ab2c-6c8dacead269\") " pod="openshift-dns/node-resolver-p5bf5" Apr 16 16:48:04.560404 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.557977 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-multus-socket-dir-parent\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.560404 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.558274 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/74ade214-8512-4cf5-93e8-0ece0e5776f2-cni-binary-copy\") pod \"multus-additional-cni-plugins-92gx2\" (UID: \"74ade214-8512-4cf5-93e8-0ece0e5776f2\") " pod="openshift-multus/multus-additional-cni-plugins-92gx2" Apr 16 16:48:04.560404 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.558369 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-multus-daemon-config\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.560404 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.559332 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/22c0427d-87ec-49df-bb00-e6b332332ea9-tmp\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.560853 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.559450 
2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c0c5c0a0-29b2-4743-af7a-0c1150829a60-ovn-node-metrics-cert\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.560853 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.559780 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b78e00dd-6abc-4e46-83bf-28cd51e87cc9-agent-certs\") pod \"konnectivity-agent-4chxn\" (UID: \"b78e00dd-6abc-4e46-83bf-28cd51e87cc9\") " pod="kube-system/konnectivity-agent-4chxn" Apr 16 16:48:04.560853 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.559939 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/22c0427d-87ec-49df-bb00-e6b332332ea9-etc-tuned\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.565589 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.565564 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fb92\" (UniqueName: \"kubernetes.io/projected/b3dfeb2f-a4ab-4fe5-ab2c-6c8dacead269-kube-api-access-2fb92\") pod \"node-resolver-p5bf5\" (UID: \"b3dfeb2f-a4ab-4fe5-ab2c-6c8dacead269\") " pod="openshift-dns/node-resolver-p5bf5" Apr 16 16:48:04.566594 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:04.566574 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:48:04.566594 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:04.566596 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered Apr 16 16:48:04.566752 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:04.566610 2572 projected.go:194] Error preparing data for projected volume kube-api-access-564lt for pod openshift-network-diagnostics/network-check-target-s7xbf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:48:04.566752 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:04.566671 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/feb3fcea-2282-411d-bb57-2562cc290f0a-kube-api-access-564lt podName:feb3fcea-2282-411d-bb57-2562cc290f0a nodeName:}" failed. No retries permitted until 2026-04-16 16:48:05.066652333 +0000 UTC m=+2.235363689 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-564lt" (UniqueName: "kubernetes.io/projected/feb3fcea-2282-411d-bb57-2562cc290f0a-kube-api-access-564lt") pod "network-check-target-s7xbf" (UID: "feb3fcea-2282-411d-bb57-2562cc290f0a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:48:04.568672 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.568637 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc6j4\" (UniqueName: \"kubernetes.io/projected/ce4c4ac0-c90c-484b-aa61-731d09fce8d3-kube-api-access-mc6j4\") pod \"multus-8jhjk\" (UID: \"ce4c4ac0-c90c-484b-aa61-731d09fce8d3\") " pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.568793 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.568774 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr7pr\" (UniqueName: \"kubernetes.io/projected/9f643739-4068-4891-858f-02df7c38bdb7-kube-api-access-lr7pr\") pod \"iptables-alerter-bpcgz\" (UID: \"9f643739-4068-4891-858f-02df7c38bdb7\") " 
pod="openshift-network-operator/iptables-alerter-bpcgz" Apr 16 16:48:04.568870 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.568807 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tsb8\" (UniqueName: \"kubernetes.io/projected/76470f07-89a2-4aa9-b5e0-2d90fd9048ab-kube-api-access-6tsb8\") pod \"aws-ebs-csi-driver-node-trvsb\" (UID: \"76470f07-89a2-4aa9-b5e0-2d90fd9048ab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-trvsb" Apr 16 16:48:04.568924 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.568875 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddln9\" (UniqueName: \"kubernetes.io/projected/22c0427d-87ec-49df-bb00-e6b332332ea9-kube-api-access-ddln9\") pod \"tuned-kkn6k\" (UID: \"22c0427d-87ec-49df-bb00-e6b332332ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.568924 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.568875 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggqmc\" (UniqueName: \"kubernetes.io/projected/74ade214-8512-4cf5-93e8-0ece0e5776f2-kube-api-access-ggqmc\") pod \"multus-additional-cni-plugins-92gx2\" (UID: \"74ade214-8512-4cf5-93e8-0ece0e5776f2\") " pod="openshift-multus/multus-additional-cni-plugins-92gx2" Apr 16 16:48:04.569269 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.569254 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w75zf\" (UniqueName: \"kubernetes.io/projected/890f4655-f936-4bb9-b82c-524efb501585-kube-api-access-w75zf\") pod \"network-metrics-daemon-dv6f9\" (UID: \"890f4655-f936-4bb9-b82c-524efb501585\") " pod="openshift-multus/network-metrics-daemon-dv6f9" Apr 16 16:48:04.569631 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.569609 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-44v89\" (UniqueName: 
\"kubernetes.io/projected/caaa1d46-b551-4960-b546-994b0ee36fed-kube-api-access-44v89\") pod \"node-ca-9tnfz\" (UID: \"caaa1d46-b551-4960-b546-994b0ee36fed\") " pod="openshift-image-registry/node-ca-9tnfz" Apr 16 16:48:04.570307 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.570289 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g79dx\" (UniqueName: \"kubernetes.io/projected/c0c5c0a0-29b2-4743-af7a-0c1150829a60-kube-api-access-g79dx\") pod \"ovnkube-node-brhp4\" (UID: \"c0c5c0a0-29b2-4743-af7a-0c1150829a60\") " pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.588080 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.587993 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-126.ec2.internal" event={"ID":"05d53b9573a1962a2ec184a8dbb43318","Type":"ContainerStarted","Data":"a0ebf992c75773ca961b31f27cd8317e339cd968ece3f449bbb3be917d9a4f42"} Apr 16 16:48:04.589003 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.588983 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-126.ec2.internal" event={"ID":"a676821af174364e03710748c5f10fbf","Type":"ContainerStarted","Data":"8fcc136cf441c4ea1f60f78cc4b2d2cc64f10dcb6d45551008c32d42dd678379"} Apr 16 16:48:04.755571 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.755537 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-p5bf5" Apr 16 16:48:04.761180 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:04.761155 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3dfeb2f_a4ab_4fe5_ab2c_6c8dacead269.slice/crio-5baadfcdad2581db1535aab13c43ef23ec5feb5bb280d333aa8f9d61d304d408 WatchSource:0}: Error finding container 5baadfcdad2581db1535aab13c43ef23ec5feb5bb280d333aa8f9d61d304d408: Status 404 returned error can't find the container with id 5baadfcdad2581db1535aab13c43ef23ec5feb5bb280d333aa8f9d61d304d408 Apr 16 16:48:04.770998 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.770980 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-bpcgz" Apr 16 16:48:04.777017 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:04.776997 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f643739_4068_4891_858f_02df7c38bdb7.slice/crio-61e95dd4101fb82f207d3eaa3abeae9b8e300b024490a7fcfd87c04ef1d0f9aa WatchSource:0}: Error finding container 61e95dd4101fb82f207d3eaa3abeae9b8e300b024490a7fcfd87c04ef1d0f9aa: Status 404 returned error can't find the container with id 61e95dd4101fb82f207d3eaa3abeae9b8e300b024490a7fcfd87c04ef1d0f9aa Apr 16 16:48:04.784077 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.784047 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:04.789735 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:04.789716 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0c5c0a0_29b2_4743_af7a_0c1150829a60.slice/crio-1afc599ba84475f2a2107bcf52b2cdf20afed7b9f7e2c73e030c8b5e425c0ed2 WatchSource:0}: Error finding container 1afc599ba84475f2a2107bcf52b2cdf20afed7b9f7e2c73e030c8b5e425c0ed2: Status 404 returned error can't find the container with id 1afc599ba84475f2a2107bcf52b2cdf20afed7b9f7e2c73e030c8b5e425c0ed2 Apr 16 16:48:04.800242 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.800226 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-4chxn" Apr 16 16:48:04.805672 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.805655 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-trvsb" Apr 16 16:48:04.807156 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:04.807137 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb78e00dd_6abc_4e46_83bf_28cd51e87cc9.slice/crio-42806b9d94ab50b2d749d91d7162603f432a6293a642b40bb1f60f29b3752a1a WatchSource:0}: Error finding container 42806b9d94ab50b2d749d91d7162603f432a6293a642b40bb1f60f29b3752a1a: Status 404 returned error can't find the container with id 42806b9d94ab50b2d749d91d7162603f432a6293a642b40bb1f60f29b3752a1a Apr 16 16:48:04.812269 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:04.812250 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76470f07_89a2_4aa9_b5e0_2d90fd9048ab.slice/crio-b74b5e2932cdbba83561d563dbb392e4601aa053029daea2a75ba3f17ce85555 WatchSource:0}: Error finding container 
b74b5e2932cdbba83561d563dbb392e4601aa053029daea2a75ba3f17ce85555: Status 404 returned error can't find the container with id b74b5e2932cdbba83561d563dbb392e4601aa053029daea2a75ba3f17ce85555 Apr 16 16:48:04.835457 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.835438 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" Apr 16 16:48:04.840211 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:04.840192 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22c0427d_87ec_49df_bb00_e6b332332ea9.slice/crio-d7f28ff295695327faa9df91e51bbe188f1bf058e04ac94e4226a1478f3e0c5f WatchSource:0}: Error finding container d7f28ff295695327faa9df91e51bbe188f1bf058e04ac94e4226a1478f3e0c5f: Status 404 returned error can't find the container with id d7f28ff295695327faa9df91e51bbe188f1bf058e04ac94e4226a1478f3e0c5f Apr 16 16:48:04.840710 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.840693 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9tnfz" Apr 16 16:48:04.845831 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:04.845812 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcaaa1d46_b551_4960_b546_994b0ee36fed.slice/crio-f00a6814940e145786ec153e0bee0175a4c30e31fd79e4d56de73890bd5fcd3d WatchSource:0}: Error finding container f00a6814940e145786ec153e0bee0175a4c30e31fd79e4d56de73890bd5fcd3d: Status 404 returned error can't find the container with id f00a6814940e145786ec153e0bee0175a4c30e31fd79e4d56de73890bd5fcd3d Apr 16 16:48:04.846898 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.846881 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-8jhjk" Apr 16 16:48:04.851885 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:04.851869 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-92gx2" Apr 16 16:48:04.852960 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:04.852932 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce4c4ac0_c90c_484b_aa61_731d09fce8d3.slice/crio-56b92b3de58c1f29d134953f9de41f5183f687f40451898024d5d1da03eaf363 WatchSource:0}: Error finding container 56b92b3de58c1f29d134953f9de41f5183f687f40451898024d5d1da03eaf363: Status 404 returned error can't find the container with id 56b92b3de58c1f29d134953f9de41f5183f687f40451898024d5d1da03eaf363 Apr 16 16:48:04.857643 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:04.857625 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74ade214_8512_4cf5_93e8_0ece0e5776f2.slice/crio-610e0801a6e30cbb5dc7859a7f98cdc8ee54d040e55fd0ac079e1b85353df2c9 WatchSource:0}: Error finding container 610e0801a6e30cbb5dc7859a7f98cdc8ee54d040e55fd0ac079e1b85353df2c9: Status 404 returned error can't find the container with id 610e0801a6e30cbb5dc7859a7f98cdc8ee54d040e55fd0ac079e1b85353df2c9 Apr 16 16:48:05.061508 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:05.061473 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/890f4655-f936-4bb9-b82c-524efb501585-metrics-certs\") pod \"network-metrics-daemon-dv6f9\" (UID: \"890f4655-f936-4bb9-b82c-524efb501585\") " pod="openshift-multus/network-metrics-daemon-dv6f9" Apr 16 16:48:05.061690 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:05.061639 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:48:05.061757 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:05.061696 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/890f4655-f936-4bb9-b82c-524efb501585-metrics-certs podName:890f4655-f936-4bb9-b82c-524efb501585 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:06.061678985 +0000 UTC m=+3.230390345 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/890f4655-f936-4bb9-b82c-524efb501585-metrics-certs") pod "network-metrics-daemon-dv6f9" (UID: "890f4655-f936-4bb9-b82c-524efb501585") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:48:05.162537 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:05.161912 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-564lt\" (UniqueName: \"kubernetes.io/projected/feb3fcea-2282-411d-bb57-2562cc290f0a-kube-api-access-564lt\") pod \"network-check-target-s7xbf\" (UID: \"feb3fcea-2282-411d-bb57-2562cc290f0a\") " pod="openshift-network-diagnostics/network-check-target-s7xbf" Apr 16 16:48:05.162537 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:05.162090 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:48:05.162537 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:05.162112 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:48:05.162537 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:05.162125 2572 projected.go:194] Error preparing data for projected volume kube-api-access-564lt for pod openshift-network-diagnostics/network-check-target-s7xbf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:48:05.162537 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:05.162185 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/feb3fcea-2282-411d-bb57-2562cc290f0a-kube-api-access-564lt podName:feb3fcea-2282-411d-bb57-2562cc290f0a nodeName:}" failed. No retries permitted until 2026-04-16 16:48:06.162165323 +0000 UTC m=+3.330876671 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-564lt" (UniqueName: "kubernetes.io/projected/feb3fcea-2282-411d-bb57-2562cc290f0a-kube-api-access-564lt") pod "network-check-target-s7xbf" (UID: "feb3fcea-2282-411d-bb57-2562cc290f0a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:48:05.435952 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:05.435861 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:48:05.477045 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:05.477009 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 16:43:04 +0000 UTC" deadline="2028-01-19 11:37:17.90520871 +0000 UTC" Apr 16 16:48:05.477045 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:05.477044 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15426h49m12.42816928s" Apr 16 16:48:05.585933 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:05.585902 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv6f9" Apr 16 16:48:05.586113 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:05.586041 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dv6f9" podUID="890f4655-f936-4bb9-b82c-524efb501585" Apr 16 16:48:05.586708 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:05.586471 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s7xbf" Apr 16 16:48:05.586708 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:05.586559 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-s7xbf" podUID="feb3fcea-2282-411d-bb57-2562cc290f0a" Apr 16 16:48:05.609824 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:05.609792 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92gx2" event={"ID":"74ade214-8512-4cf5-93e8-0ece0e5776f2","Type":"ContainerStarted","Data":"610e0801a6e30cbb5dc7859a7f98cdc8ee54d040e55fd0ac079e1b85353df2c9"} Apr 16 16:48:05.638951 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:05.638892 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8jhjk" event={"ID":"ce4c4ac0-c90c-484b-aa61-731d09fce8d3","Type":"ContainerStarted","Data":"56b92b3de58c1f29d134953f9de41f5183f687f40451898024d5d1da03eaf363"} Apr 16 16:48:05.654404 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:05.654371 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9tnfz" event={"ID":"caaa1d46-b551-4960-b546-994b0ee36fed","Type":"ContainerStarted","Data":"f00a6814940e145786ec153e0bee0175a4c30e31fd79e4d56de73890bd5fcd3d"} Apr 16 16:48:05.655718 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:05.655693 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4chxn" event={"ID":"b78e00dd-6abc-4e46-83bf-28cd51e87cc9","Type":"ContainerStarted","Data":"42806b9d94ab50b2d749d91d7162603f432a6293a642b40bb1f60f29b3752a1a"} Apr 16 16:48:05.661303 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:05.661278 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" event={"ID":"c0c5c0a0-29b2-4743-af7a-0c1150829a60","Type":"ContainerStarted","Data":"1afc599ba84475f2a2107bcf52b2cdf20afed7b9f7e2c73e030c8b5e425c0ed2"} Apr 16 16:48:05.667097 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:05.667058 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-bpcgz" 
event={"ID":"9f643739-4068-4891-858f-02df7c38bdb7","Type":"ContainerStarted","Data":"61e95dd4101fb82f207d3eaa3abeae9b8e300b024490a7fcfd87c04ef1d0f9aa"} Apr 16 16:48:05.672232 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:05.672203 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" event={"ID":"22c0427d-87ec-49df-bb00-e6b332332ea9","Type":"ContainerStarted","Data":"d7f28ff295695327faa9df91e51bbe188f1bf058e04ac94e4226a1478f3e0c5f"} Apr 16 16:48:05.679692 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:05.679662 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-trvsb" event={"ID":"76470f07-89a2-4aa9-b5e0-2d90fd9048ab","Type":"ContainerStarted","Data":"b74b5e2932cdbba83561d563dbb392e4601aa053029daea2a75ba3f17ce85555"} Apr 16 16:48:05.694496 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:05.694432 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-p5bf5" event={"ID":"b3dfeb2f-a4ab-4fe5-ab2c-6c8dacead269","Type":"ContainerStarted","Data":"5baadfcdad2581db1535aab13c43ef23ec5feb5bb280d333aa8f9d61d304d408"} Apr 16 16:48:05.729876 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:05.729833 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:48:05.861624 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:05.861588 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:48:06.068834 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:06.068380 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/890f4655-f936-4bb9-b82c-524efb501585-metrics-certs\") pod \"network-metrics-daemon-dv6f9\" (UID: \"890f4655-f936-4bb9-b82c-524efb501585\") " pod="openshift-multus/network-metrics-daemon-dv6f9" Apr 16 
16:48:06.068834 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:06.068538 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:48:06.068834 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:06.068621 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/890f4655-f936-4bb9-b82c-524efb501585-metrics-certs podName:890f4655-f936-4bb9-b82c-524efb501585 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:08.068582282 +0000 UTC m=+5.237293630 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/890f4655-f936-4bb9-b82c-524efb501585-metrics-certs") pod "network-metrics-daemon-dv6f9" (UID: "890f4655-f936-4bb9-b82c-524efb501585") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:48:06.169469 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:06.169433 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-564lt\" (UniqueName: \"kubernetes.io/projected/feb3fcea-2282-411d-bb57-2562cc290f0a-kube-api-access-564lt\") pod \"network-check-target-s7xbf\" (UID: \"feb3fcea-2282-411d-bb57-2562cc290f0a\") " pod="openshift-network-diagnostics/network-check-target-s7xbf" Apr 16 16:48:06.169654 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:06.169582 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:48:06.169654 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:06.169601 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:48:06.169654 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:06.169614 2572 projected.go:194] Error preparing data for projected 
volume kube-api-access-564lt for pod openshift-network-diagnostics/network-check-target-s7xbf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:48:06.169809 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:06.169680 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/feb3fcea-2282-411d-bb57-2562cc290f0a-kube-api-access-564lt podName:feb3fcea-2282-411d-bb57-2562cc290f0a nodeName:}" failed. No retries permitted until 2026-04-16 16:48:08.169661807 +0000 UTC m=+5.338373155 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-564lt" (UniqueName: "kubernetes.io/projected/feb3fcea-2282-411d-bb57-2562cc290f0a-kube-api-access-564lt") pod "network-check-target-s7xbf" (UID: "feb3fcea-2282-411d-bb57-2562cc290f0a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:48:06.477725 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:06.477585 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 16:43:04 +0000 UTC" deadline="2027-12-24 18:14:32.225812732 +0000 UTC" Apr 16 16:48:06.477725 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:06.477682 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14809h26m25.74816759s" Apr 16 16:48:07.586512 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:07.586244 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv6f9" Apr 16 16:48:07.586991 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:07.586634 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dv6f9" podUID="890f4655-f936-4bb9-b82c-524efb501585" Apr 16 16:48:07.589620 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:07.587101 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s7xbf" Apr 16 16:48:07.589620 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:07.587195 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-s7xbf" podUID="feb3fcea-2282-411d-bb57-2562cc290f0a"
Apr 16 16:48:08.088426 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:08.088392 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/890f4655-f936-4bb9-b82c-524efb501585-metrics-certs\") pod \"network-metrics-daemon-dv6f9\" (UID: \"890f4655-f936-4bb9-b82c-524efb501585\") " pod="openshift-multus/network-metrics-daemon-dv6f9"
Apr 16 16:48:08.088636 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:08.088608 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:48:08.088712 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:08.088690 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/890f4655-f936-4bb9-b82c-524efb501585-metrics-certs podName:890f4655-f936-4bb9-b82c-524efb501585 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:12.088670077 +0000 UTC m=+9.257381423 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/890f4655-f936-4bb9-b82c-524efb501585-metrics-certs") pod "network-metrics-daemon-dv6f9" (UID: "890f4655-f936-4bb9-b82c-524efb501585") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:48:08.190290 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:08.189675 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-564lt\" (UniqueName: \"kubernetes.io/projected/feb3fcea-2282-411d-bb57-2562cc290f0a-kube-api-access-564lt\") pod \"network-check-target-s7xbf\" (UID: \"feb3fcea-2282-411d-bb57-2562cc290f0a\") " pod="openshift-network-diagnostics/network-check-target-s7xbf"
Apr 16 16:48:08.190290 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:08.189827 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:48:08.190290 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:08.189847 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:48:08.190290 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:08.189860 2572 projected.go:194] Error preparing data for projected volume kube-api-access-564lt for pod openshift-network-diagnostics/network-check-target-s7xbf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:48:08.190290 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:08.189918 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/feb3fcea-2282-411d-bb57-2562cc290f0a-kube-api-access-564lt podName:feb3fcea-2282-411d-bb57-2562cc290f0a nodeName:}" failed. No retries permitted until 2026-04-16 16:48:12.189899434 +0000 UTC m=+9.358610781 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-564lt" (UniqueName: "kubernetes.io/projected/feb3fcea-2282-411d-bb57-2562cc290f0a-kube-api-access-564lt") pod "network-check-target-s7xbf" (UID: "feb3fcea-2282-411d-bb57-2562cc290f0a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:48:09.589570 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:09.589542 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv6f9"
Apr 16 16:48:09.590021 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:09.589542 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s7xbf"
Apr 16 16:48:09.590021 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:09.589684 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dv6f9" podUID="890f4655-f936-4bb9-b82c-524efb501585"
Apr 16 16:48:09.590021 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:09.589806 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-s7xbf" podUID="feb3fcea-2282-411d-bb57-2562cc290f0a"
Apr 16 16:48:11.585835 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:11.585765 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s7xbf"
Apr 16 16:48:11.585835 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:11.585809 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv6f9"
Apr 16 16:48:11.586384 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:11.585903 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s7xbf" podUID="feb3fcea-2282-411d-bb57-2562cc290f0a"
Apr 16 16:48:11.586384 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:11.586179 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dv6f9" podUID="890f4655-f936-4bb9-b82c-524efb501585"
Apr 16 16:48:12.126583 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:12.126019 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/890f4655-f936-4bb9-b82c-524efb501585-metrics-certs\") pod \"network-metrics-daemon-dv6f9\" (UID: \"890f4655-f936-4bb9-b82c-524efb501585\") " pod="openshift-multus/network-metrics-daemon-dv6f9"
Apr 16 16:48:12.126583 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:12.126173 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:48:12.126583 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:12.126242 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/890f4655-f936-4bb9-b82c-524efb501585-metrics-certs podName:890f4655-f936-4bb9-b82c-524efb501585 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:20.126222856 +0000 UTC m=+17.294934204 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/890f4655-f936-4bb9-b82c-524efb501585-metrics-certs") pod "network-metrics-daemon-dv6f9" (UID: "890f4655-f936-4bb9-b82c-524efb501585") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:48:12.227511 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:12.227471 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-564lt\" (UniqueName: \"kubernetes.io/projected/feb3fcea-2282-411d-bb57-2562cc290f0a-kube-api-access-564lt\") pod \"network-check-target-s7xbf\" (UID: \"feb3fcea-2282-411d-bb57-2562cc290f0a\") " pod="openshift-network-diagnostics/network-check-target-s7xbf"
Apr 16 16:48:12.227678 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:12.227650 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:48:12.227736 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:12.227679 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:48:12.227736 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:12.227693 2572 projected.go:194] Error preparing data for projected volume kube-api-access-564lt for pod openshift-network-diagnostics/network-check-target-s7xbf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:48:12.227836 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:12.227784 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/feb3fcea-2282-411d-bb57-2562cc290f0a-kube-api-access-564lt podName:feb3fcea-2282-411d-bb57-2562cc290f0a nodeName:}" failed. No retries permitted until 2026-04-16 16:48:20.227764172 +0000 UTC m=+17.396475528 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-564lt" (UniqueName: "kubernetes.io/projected/feb3fcea-2282-411d-bb57-2562cc290f0a-kube-api-access-564lt") pod "network-check-target-s7xbf" (UID: "feb3fcea-2282-411d-bb57-2562cc290f0a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:48:13.586924 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:13.586883 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s7xbf"
Apr 16 16:48:13.587382 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:13.586984 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s7xbf" podUID="feb3fcea-2282-411d-bb57-2562cc290f0a"
Apr 16 16:48:13.587382 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:13.587354 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv6f9"
Apr 16 16:48:13.587497 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:13.587452 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-dv6f9" podUID="890f4655-f936-4bb9-b82c-524efb501585"
Apr 16 16:48:15.585794 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:15.585758 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s7xbf"
Apr 16 16:48:15.586245 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:15.585887 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s7xbf" podUID="feb3fcea-2282-411d-bb57-2562cc290f0a"
Apr 16 16:48:15.586245 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:15.585947 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv6f9"
Apr 16 16:48:15.586245 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:15.586085 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dv6f9" podUID="890f4655-f936-4bb9-b82c-524efb501585"
Apr 16 16:48:17.586076 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:17.586021 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s7xbf"
Apr 16 16:48:17.586076 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:17.586053 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv6f9"
Apr 16 16:48:17.586597 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:17.586173 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s7xbf" podUID="feb3fcea-2282-411d-bb57-2562cc290f0a"
Apr 16 16:48:17.586597 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:17.586322 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dv6f9" podUID="890f4655-f936-4bb9-b82c-524efb501585"
Apr 16 16:48:19.586155 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:19.586118 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s7xbf"
Apr 16 16:48:19.586720 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:19.586138 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv6f9"
Apr 16 16:48:19.586720 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:19.586233 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s7xbf" podUID="feb3fcea-2282-411d-bb57-2562cc290f0a"
Apr 16 16:48:19.586720 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:19.586296 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dv6f9" podUID="890f4655-f936-4bb9-b82c-524efb501585"
Apr 16 16:48:20.183796 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:20.183764 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/890f4655-f936-4bb9-b82c-524efb501585-metrics-certs\") pod \"network-metrics-daemon-dv6f9\" (UID: \"890f4655-f936-4bb9-b82c-524efb501585\") " pod="openshift-multus/network-metrics-daemon-dv6f9"
Apr 16 16:48:20.184011 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:20.183899 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:48:20.184011 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:20.183959 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/890f4655-f936-4bb9-b82c-524efb501585-metrics-certs podName:890f4655-f936-4bb9-b82c-524efb501585 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:36.183942103 +0000 UTC m=+33.352653451 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/890f4655-f936-4bb9-b82c-524efb501585-metrics-certs") pod "network-metrics-daemon-dv6f9" (UID: "890f4655-f936-4bb9-b82c-524efb501585") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:48:20.284250 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:20.284221 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-564lt\" (UniqueName: \"kubernetes.io/projected/feb3fcea-2282-411d-bb57-2562cc290f0a-kube-api-access-564lt\") pod \"network-check-target-s7xbf\" (UID: \"feb3fcea-2282-411d-bb57-2562cc290f0a\") " pod="openshift-network-diagnostics/network-check-target-s7xbf"
Apr 16 16:48:20.284406 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:20.284365 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:48:20.284406 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:20.284384 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:48:20.284406 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:20.284394 2572 projected.go:194] Error preparing data for projected volume kube-api-access-564lt for pod openshift-network-diagnostics/network-check-target-s7xbf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:48:20.284553 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:20.284442 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/feb3fcea-2282-411d-bb57-2562cc290f0a-kube-api-access-564lt podName:feb3fcea-2282-411d-bb57-2562cc290f0a nodeName:}" failed. No retries permitted until 2026-04-16 16:48:36.284428547 +0000 UTC m=+33.453139888 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-564lt" (UniqueName: "kubernetes.io/projected/feb3fcea-2282-411d-bb57-2562cc290f0a-kube-api-access-564lt") pod "network-check-target-s7xbf" (UID: "feb3fcea-2282-411d-bb57-2562cc290f0a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:48:21.585827 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:21.585786 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv6f9"
Apr 16 16:48:21.586413 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:21.585786 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s7xbf"
Apr 16 16:48:21.586413 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:21.585940 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dv6f9" podUID="890f4655-f936-4bb9-b82c-524efb501585"
Apr 16 16:48:21.586413 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:21.585985 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s7xbf" podUID="feb3fcea-2282-411d-bb57-2562cc290f0a"
Apr 16 16:48:23.587637 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:23.587228 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv6f9"
Apr 16 16:48:23.588418 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:23.587282 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s7xbf"
Apr 16 16:48:23.588418 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:23.587719 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dv6f9" podUID="890f4655-f936-4bb9-b82c-524efb501585"
Apr 16 16:48:23.588418 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:23.587763 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-s7xbf" podUID="feb3fcea-2282-411d-bb57-2562cc290f0a"
Apr 16 16:48:23.736429 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:23.736325 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9tnfz" event={"ID":"caaa1d46-b551-4960-b546-994b0ee36fed","Type":"ContainerStarted","Data":"60899f571b4890eb843c19d1d8f0ad0fd1e7173b9616a03befc7348b55fe87d2"}
Apr 16 16:48:23.737542 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:23.737516 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4chxn" event={"ID":"b78e00dd-6abc-4e46-83bf-28cd51e87cc9","Type":"ContainerStarted","Data":"0cf22fcae84a287493f418bed9de275b0e818c8e1331de23a01521542f8286be"}
Apr 16 16:48:23.739702 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:23.739684 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brhp4_c0c5c0a0-29b2-4743-af7a-0c1150829a60/ovn-acl-logging/0.log"
Apr 16 16:48:23.739968 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:23.739951 2572 generic.go:358] "Generic (PLEG): container finished" podID="c0c5c0a0-29b2-4743-af7a-0c1150829a60" containerID="601970a7e4f2c3ad920115de5869295bce81d9d09c33ddea4146e44c35a2c0ab" exitCode=1
Apr 16 16:48:23.740020 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:23.740007 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" event={"ID":"c0c5c0a0-29b2-4743-af7a-0c1150829a60","Type":"ContainerStarted","Data":"12e2f228e3a5cb5773df7080320041341b920d5c1c6fe12aa84783e053dd73bc"}
Apr 16 16:48:23.740053 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:23.740026 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" event={"ID":"c0c5c0a0-29b2-4743-af7a-0c1150829a60","Type":"ContainerStarted","Data":"f5a29a3c4ed12ac7d1d6a62f4d91c0e6608f2221bcd25c906576b95c8b7f353c"}
Apr 16 16:48:23.740053 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:23.740039 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" event={"ID":"c0c5c0a0-29b2-4743-af7a-0c1150829a60","Type":"ContainerStarted","Data":"c3b5e9ac2a43250322b85dd58bfac93249da0e6cb0a0b38e43606f30b075f40b"}
Apr 16 16:48:23.740053 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:23.740050 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" event={"ID":"c0c5c0a0-29b2-4743-af7a-0c1150829a60","Type":"ContainerStarted","Data":"58350beff029cd342db528b853fc542c35b902a078d3981713d091e8639b8c59"}
Apr 16 16:48:23.740181 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:23.740059 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" event={"ID":"c0c5c0a0-29b2-4743-af7a-0c1150829a60","Type":"ContainerDied","Data":"601970a7e4f2c3ad920115de5869295bce81d9d09c33ddea4146e44c35a2c0ab"}
Apr 16 16:48:23.740181 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:23.740089 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" event={"ID":"c0c5c0a0-29b2-4743-af7a-0c1150829a60","Type":"ContainerStarted","Data":"418c24f83a8591d362fa5fc0f86919ae9a3526c230eb82657e796a334276a9d7"}
Apr 16 16:48:23.741225 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:23.741206 2572 generic.go:358] "Generic (PLEG): container finished" podID="05d53b9573a1962a2ec184a8dbb43318" containerID="87ff37c9d656491b6863fc83e1488a1d82058cf34a0e46dcf05de668743be964" exitCode=0
Apr 16 16:48:23.741298 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:23.741236 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-126.ec2.internal" event={"ID":"05d53b9573a1962a2ec184a8dbb43318","Type":"ContainerDied","Data":"87ff37c9d656491b6863fc83e1488a1d82058cf34a0e46dcf05de668743be964"}
Apr 16 16:48:23.744193 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:23.744166 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" event={"ID":"22c0427d-87ec-49df-bb00-e6b332332ea9","Type":"ContainerStarted","Data":"bf3929cea1638d0f04109410b258b01f71034c8ec0156ac18912aec0f624a482"}
Apr 16 16:48:23.745720 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:23.745702 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-trvsb" event={"ID":"76470f07-89a2-4aa9-b5e0-2d90fd9048ab","Type":"ContainerStarted","Data":"400cb1bac774b77d577d6b5a3d77af9ff9f738d46f64abbc3de18c025ca31d0f"}
Apr 16 16:48:23.746852 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:23.746834 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-p5bf5" event={"ID":"b3dfeb2f-a4ab-4fe5-ab2c-6c8dacead269","Type":"ContainerStarted","Data":"7361b5597d96ffaa6fd38af78b315942abe9aab796097e88568f703dad70fd57"}
Apr 16 16:48:23.748016 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:23.747997 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-126.ec2.internal" event={"ID":"a676821af174364e03710748c5f10fbf","Type":"ContainerStarted","Data":"877434ac3572dfecf4abb063e8768e73d037ebcd9d23d7a65a55a9feec84ca0e"}
Apr 16 16:48:23.751630 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:23.751605 2572 generic.go:358] "Generic (PLEG): container finished" podID="74ade214-8512-4cf5-93e8-0ece0e5776f2" containerID="e9734edff120c5c7de6939509de34c3ce254ce8fe1268dd87e31035af1e65b54" exitCode=0
Apr 16 16:48:23.751715 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:23.751663 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92gx2" event={"ID":"74ade214-8512-4cf5-93e8-0ece0e5776f2","Type":"ContainerDied","Data":"e9734edff120c5c7de6939509de34c3ce254ce8fe1268dd87e31035af1e65b54"}
Apr 16 16:48:23.752945 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:23.752890 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8jhjk" event={"ID":"ce4c4ac0-c90c-484b-aa61-731d09fce8d3","Type":"ContainerStarted","Data":"55d8662dcc54513c9628ba1a9c4bb4dcf37831b77e89b7fd35a9f62d45ef833f"}
Apr 16 16:48:23.753409 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:23.753360 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9tnfz" podStartSLOduration=3.014666802 podStartE2EDuration="20.753347647s" podCreationTimestamp="2026-04-16 16:48:03 +0000 UTC" firstStartedPulling="2026-04-16 16:48:04.849496711 +0000 UTC m=+2.018208053" lastFinishedPulling="2026-04-16 16:48:22.588177353 +0000 UTC m=+19.756888898" observedRunningTime="2026-04-16 16:48:23.752584651 +0000 UTC m=+20.921296031" watchObservedRunningTime="2026-04-16 16:48:23.753347647 +0000 UTC m=+20.922059011"
Apr 16 16:48:23.767839 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:23.767799 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-126.ec2.internal" podStartSLOduration=20.767784705 podStartE2EDuration="20.767784705s" podCreationTimestamp="2026-04-16 16:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:48:23.767657645 +0000 UTC m=+20.936369020" watchObservedRunningTime="2026-04-16 16:48:23.767784705 +0000 UTC m=+20.936496068"
Apr 16 16:48:23.785691 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:23.785645 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-4chxn" podStartSLOduration=11.068558904 podStartE2EDuration="20.785633518s" podCreationTimestamp="2026-04-16 16:48:03 +0000 UTC" firstStartedPulling="2026-04-16 16:48:04.809471526 +0000 UTC m=+1.978182869" lastFinishedPulling="2026-04-16 16:48:14.526546127 +0000 UTC m=+11.695257483" observedRunningTime="2026-04-16 16:48:23.785460591 +0000 UTC m=+20.954171950" watchObservedRunningTime="2026-04-16 16:48:23.785633518 +0000 UTC m=+20.954344881"
Apr 16 16:48:23.802695 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:23.802650 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-kkn6k" podStartSLOduration=3.027466665 podStartE2EDuration="20.802637357s" podCreationTimestamp="2026-04-16 16:48:03 +0000 UTC" firstStartedPulling="2026-04-16 16:48:04.841581575 +0000 UTC m=+2.010292916" lastFinishedPulling="2026-04-16 16:48:22.616752253 +0000 UTC m=+19.785463608" observedRunningTime="2026-04-16 16:48:23.802184484 +0000 UTC m=+20.970895847" watchObservedRunningTime="2026-04-16 16:48:23.802637357 +0000 UTC m=+20.971348720"
Apr 16 16:48:23.818714 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:23.818679 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-p5bf5" podStartSLOduration=2.9648160519999998 podStartE2EDuration="20.818667034s" podCreationTimestamp="2026-04-16 16:48:03 +0000 UTC" firstStartedPulling="2026-04-16 16:48:04.762597992 +0000 UTC m=+1.931309334" lastFinishedPulling="2026-04-16 16:48:22.616448962 +0000 UTC m=+19.785160316" observedRunningTime="2026-04-16 16:48:23.818631065 +0000 UTC m=+20.987342432" watchObservedRunningTime="2026-04-16 16:48:23.818667034 +0000 UTC m=+20.987378398"
Apr 16 16:48:23.872724 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:23.872685 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8jhjk" podStartSLOduration=3.103550925 podStartE2EDuration="20.872672311s" podCreationTimestamp="2026-04-16 16:48:03 +0000 UTC" firstStartedPulling="2026-04-16 16:48:04.854898849 +0000 UTC m=+2.023610191" lastFinishedPulling="2026-04-16 16:48:22.624020235 +0000 UTC m=+19.792731577" observedRunningTime="2026-04-16 16:48:23.872051514 +0000 UTC m=+21.040762888" watchObservedRunningTime="2026-04-16 16:48:23.872672311 +0000 UTC m=+21.041383693"
Apr 16 16:48:24.398617 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:24.398587 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 16:48:24.513308 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:24.513212 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T16:48:24.398606282Z","UUID":"717936de-e472-47ac-80e4-2a80413a6aec","Handler":null,"Name":"","Endpoint":""}
Apr 16 16:48:24.516924 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:24.516853 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 16:48:24.516924 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:24.516885 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 16:48:24.755916 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:24.755882 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-bpcgz" event={"ID":"9f643739-4068-4891-858f-02df7c38bdb7","Type":"ContainerStarted","Data":"b8f1a444f0aa922552576db681da67d5ae6e5c115e3c429828ee53bb43ff907b"}
Apr 16 16:48:24.757618 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:24.757590 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-126.ec2.internal" event={"ID":"05d53b9573a1962a2ec184a8dbb43318","Type":"ContainerStarted","Data":"f60012215e3fbe28b801b48719030893cf5dc110327a8ec0694fcddada186208"}
Apr 16 16:48:24.759384 ip-10-0-137-126
kubenswrapper[2572]: I0416 16:48:24.759339 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-trvsb" event={"ID":"76470f07-89a2-4aa9-b5e0-2d90fd9048ab","Type":"ContainerStarted","Data":"3c93c6632bed85bb9b7f4bb57b74129a36d692265139833f073621cccaadc28e"}
Apr 16 16:48:24.783712 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:24.781390 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-bpcgz" podStartSLOduration=3.972147779 podStartE2EDuration="21.781359168s" podCreationTimestamp="2026-04-16 16:48:03 +0000 UTC" firstStartedPulling="2026-04-16 16:48:04.778572883 +0000 UTC m=+1.947284224" lastFinishedPulling="2026-04-16 16:48:22.58778427 +0000 UTC m=+19.756495613" observedRunningTime="2026-04-16 16:48:24.78092662 +0000 UTC m=+21.949637984" watchObservedRunningTime="2026-04-16 16:48:24.781359168 +0000 UTC m=+21.950070532"
Apr 16 16:48:24.804566 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:24.804524 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-126.ec2.internal" podStartSLOduration=21.804509491 podStartE2EDuration="21.804509491s" podCreationTimestamp="2026-04-16 16:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:48:24.804289655 +0000 UTC m=+21.973001021" watchObservedRunningTime="2026-04-16 16:48:24.804509491 +0000 UTC m=+21.973220853"
Apr 16 16:48:25.585635 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:25.585553 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s7xbf"
Apr 16 16:48:25.585635 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:25.585600 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv6f9"
Apr 16 16:48:25.585862 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:25.585672 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s7xbf" podUID="feb3fcea-2282-411d-bb57-2562cc290f0a"
Apr 16 16:48:25.585862 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:25.585815 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dv6f9" podUID="890f4655-f936-4bb9-b82c-524efb501585"
Apr 16 16:48:25.763000 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:25.762961 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-trvsb" event={"ID":"76470f07-89a2-4aa9-b5e0-2d90fd9048ab","Type":"ContainerStarted","Data":"87a9f343f0668008f525efd04ebc79bc63aaee573be4565bb56e8f18d4b6a38f"}
Apr 16 16:48:25.766206 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:25.766184 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brhp4_c0c5c0a0-29b2-4743-af7a-0c1150829a60/ovn-acl-logging/0.log"
Apr 16 16:48:25.766673 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:25.766645 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" event={"ID":"c0c5c0a0-29b2-4743-af7a-0c1150829a60","Type":"ContainerStarted","Data":"cd26177d36c33128069426c5fe01a8af5a33dcc6719ce273662b6326bbc1b015"}
Apr 16 16:48:25.782211 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:25.782173 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-trvsb" podStartSLOduration=2.411122306 podStartE2EDuration="22.782159443s" podCreationTimestamp="2026-04-16 16:48:03 +0000 UTC" firstStartedPulling="2026-04-16 16:48:04.813533739 +0000 UTC m=+1.982245081" lastFinishedPulling="2026-04-16 16:48:25.184570863 +0000 UTC m=+22.353282218" observedRunningTime="2026-04-16 16:48:25.781738952 +0000 UTC m=+22.950450315" watchObservedRunningTime="2026-04-16 16:48:25.782159443 +0000 UTC m=+22.950870808"
Apr 16 16:48:27.300323 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:27.300286 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-4chxn"
Apr 16 16:48:27.300985 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:27.300963 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-4chxn"
Apr 16 16:48:27.586351 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:27.586270 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv6f9"
Apr 16 16:48:27.586510 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:27.586420 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dv6f9" podUID="890f4655-f936-4bb9-b82c-524efb501585"
Apr 16 16:48:27.586510 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:27.586469 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s7xbf" Apr 16 16:48:27.586618 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:27.586572 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s7xbf" podUID="feb3fcea-2282-411d-bb57-2562cc290f0a" Apr 16 16:48:27.770640 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:27.770607 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-4chxn" Apr 16 16:48:27.771212 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:27.771190 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-4chxn" Apr 16 16:48:28.773838 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:28.773667 2572 generic.go:358] "Generic (PLEG): container finished" podID="74ade214-8512-4cf5-93e8-0ece0e5776f2" containerID="aa4fa2de930d42c5974a482031ebfbdf2420f9ebc83526d74a3cde43139f69b2" exitCode=0 Apr 16 16:48:28.774625 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:28.773754 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92gx2" event={"ID":"74ade214-8512-4cf5-93e8-0ece0e5776f2","Type":"ContainerDied","Data":"aa4fa2de930d42c5974a482031ebfbdf2420f9ebc83526d74a3cde43139f69b2"} Apr 16 16:48:28.777074 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:28.777047 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brhp4_c0c5c0a0-29b2-4743-af7a-0c1150829a60/ovn-acl-logging/0.log" Apr 16 16:48:28.777356 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:28.777337 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" event={"ID":"c0c5c0a0-29b2-4743-af7a-0c1150829a60","Type":"ContainerStarted","Data":"76072ea943473671c59c2b7a4d3489897116f3d641143b59ae7357ead27de504"} Apr 16 16:48:28.777583 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:28.777570 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:28.777629 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:28.777594 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:28.777709 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:28.777695 2572 scope.go:117] "RemoveContainer" containerID="601970a7e4f2c3ad920115de5869295bce81d9d09c33ddea4146e44c35a2c0ab" Apr 16 16:48:28.792417 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:28.792400 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:29.586083 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:29.586038 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv6f9" Apr 16 16:48:29.586248 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:29.586202 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dv6f9" podUID="890f4655-f936-4bb9-b82c-524efb501585" Apr 16 16:48:29.586248 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:29.586043 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s7xbf" Apr 16 16:48:29.586359 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:29.586312 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s7xbf" podUID="feb3fcea-2282-411d-bb57-2562cc290f0a" Apr 16 16:48:29.783645 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:29.783419 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brhp4_c0c5c0a0-29b2-4743-af7a-0c1150829a60/ovn-acl-logging/0.log" Apr 16 16:48:29.784334 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:29.784296 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" event={"ID":"c0c5c0a0-29b2-4743-af7a-0c1150829a60","Type":"ContainerStarted","Data":"93e622fd31951fd7ecaeedb620eb1d9931ec0259c58f1e54b45c82972bd3b0d2"} Apr 16 16:48:29.784795 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:29.784772 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:29.802629 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:29.802584 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 16 16:48:29.817850 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:29.817795 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" podStartSLOduration=8.945345198 podStartE2EDuration="26.817780204s" podCreationTimestamp="2026-04-16 16:48:03 +0000 UTC" firstStartedPulling="2026-04-16 16:48:04.791192215 +0000 UTC m=+1.959903557" 
lastFinishedPulling="2026-04-16 16:48:22.663627222 +0000 UTC m=+19.832338563" observedRunningTime="2026-04-16 16:48:29.817461924 +0000 UTC m=+26.986173288" watchObservedRunningTime="2026-04-16 16:48:29.817780204 +0000 UTC m=+26.986491564" Apr 16 16:48:29.841008 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:29.840905 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-s7xbf"] Apr 16 16:48:29.841152 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:29.841039 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s7xbf" Apr 16 16:48:29.841200 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:29.841163 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s7xbf" podUID="feb3fcea-2282-411d-bb57-2562cc290f0a" Apr 16 16:48:29.844119 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:29.844096 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dv6f9"] Apr 16 16:48:29.844226 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:29.844206 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv6f9" Apr 16 16:48:29.844336 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:29.844316 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dv6f9" podUID="890f4655-f936-4bb9-b82c-524efb501585" Apr 16 16:48:30.787154 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:30.787123 2572 generic.go:358] "Generic (PLEG): container finished" podID="74ade214-8512-4cf5-93e8-0ece0e5776f2" containerID="52f26d072650ffec45ff32be65933a359b248f0c5d982858b4ddb5f9a8a9c991" exitCode=0 Apr 16 16:48:30.787522 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:30.787214 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92gx2" event={"ID":"74ade214-8512-4cf5-93e8-0ece0e5776f2","Type":"ContainerDied","Data":"52f26d072650ffec45ff32be65933a359b248f0c5d982858b4ddb5f9a8a9c991"} Apr 16 16:48:31.585534 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:31.585504 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s7xbf" Apr 16 16:48:31.585534 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:31.585520 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv6f9" Apr 16 16:48:31.585685 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:31.585594 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s7xbf" podUID="feb3fcea-2282-411d-bb57-2562cc290f0a" Apr 16 16:48:31.585685 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:31.585655 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dv6f9" podUID="890f4655-f936-4bb9-b82c-524efb501585" Apr 16 16:48:32.792904 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:32.792869 2572 generic.go:358] "Generic (PLEG): container finished" podID="74ade214-8512-4cf5-93e8-0ece0e5776f2" containerID="31c9844630009ba222b687edcb1d056a9de437721c3803e3a03d6446f683e298" exitCode=0 Apr 16 16:48:32.793293 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:32.792929 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92gx2" event={"ID":"74ade214-8512-4cf5-93e8-0ece0e5776f2","Type":"ContainerDied","Data":"31c9844630009ba222b687edcb1d056a9de437721c3803e3a03d6446f683e298"} Apr 16 16:48:33.587347 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:33.587318 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv6f9" Apr 16 16:48:33.587524 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:33.587424 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dv6f9" podUID="890f4655-f936-4bb9-b82c-524efb501585" Apr 16 16:48:33.587605 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:33.587517 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s7xbf" Apr 16 16:48:33.587724 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:33.587694 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s7xbf" podUID="feb3fcea-2282-411d-bb57-2562cc290f0a" Apr 16 16:48:35.585984 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.585951 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s7xbf" Apr 16 16:48:35.586356 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:35.586086 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s7xbf" podUID="feb3fcea-2282-411d-bb57-2562cc290f0a" Apr 16 16:48:35.586356 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.586100 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv6f9" Apr 16 16:48:35.586356 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:35.586214 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dv6f9" podUID="890f4655-f936-4bb9-b82c-524efb501585" Apr 16 16:48:35.628812 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.628779 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-126.ec2.internal" event="NodeReady" Apr 16 16:48:35.628961 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.628950 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 16:48:35.663087 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.663043 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-b8c6f94d8-cghq8"] Apr 16 16:48:35.683950 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.683914 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-k7p77"] Apr 16 16:48:35.684154 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.684134 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8" Apr 16 16:48:35.687129 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.686937 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 16:48:35.687129 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.686959 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 16:48:35.687309 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.687294 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 16:48:35.687367 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.687344 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-lm58j\"" Apr 16 16:48:35.695103 ip-10-0-137-126 
kubenswrapper[2572]: I0416 16:48:35.695051 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 16:48:35.702868 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.702848 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-b8c6f94d8-cghq8"] Apr 16 16:48:35.702962 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.702871 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-bdzw7"] Apr 16 16:48:35.703033 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.703009 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-k7p77" Apr 16 16:48:35.706279 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.706257 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 16:48:35.706476 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.706455 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 16:48:35.706586 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.706527 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fwjqf\"" Apr 16 16:48:35.720638 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.720618 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-k7p77"] Apr 16 16:48:35.720718 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.720645 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bdzw7"] Apr 16 16:48:35.720771 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.720737 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bdzw7" Apr 16 16:48:35.723816 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.723795 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gwwfm\"" Apr 16 16:48:35.723924 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.723911 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 16:48:35.723995 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.723955 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 16:48:35.724056 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.724005 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 16:48:35.796423 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.796353 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-installation-pull-secrets\") pod \"image-registry-b8c6f94d8-cghq8\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") " pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8" Apr 16 16:48:35.796423 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.796383 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmt9c\" (UniqueName: \"kubernetes.io/projected/535dfd9c-5e07-4e18-886d-57be1138629f-kube-api-access-dmt9c\") pod \"ingress-canary-bdzw7\" (UID: \"535dfd9c-5e07-4e18-886d-57be1138629f\") " pod="openshift-ingress-canary/ingress-canary-bdzw7" Apr 16 16:48:35.796423 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.796407 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74b52111-9c5e-4b37-ab68-e34630312fcb-config-volume\") pod \"dns-default-k7p77\" (UID: \"74b52111-9c5e-4b37-ab68-e34630312fcb\") " pod="openshift-dns/dns-default-k7p77" Apr 16 16:48:35.796657 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.796469 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-image-registry-private-configuration\") pod \"image-registry-b8c6f94d8-cghq8\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") " pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8" Apr 16 16:48:35.796657 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.796494 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-ca-trust-extracted\") pod \"image-registry-b8c6f94d8-cghq8\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") " pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8" Apr 16 16:48:35.796657 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.796515 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-bound-sa-token\") pod \"image-registry-b8c6f94d8-cghq8\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") " pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8" Apr 16 16:48:35.796657 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.796539 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-registry-certificates\") pod 
\"image-registry-b8c6f94d8-cghq8\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") " pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8" Apr 16 16:48:35.796657 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.796556 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/74b52111-9c5e-4b37-ab68-e34630312fcb-tmp-dir\") pod \"dns-default-k7p77\" (UID: \"74b52111-9c5e-4b37-ab68-e34630312fcb\") " pod="openshift-dns/dns-default-k7p77" Apr 16 16:48:35.796657 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.796577 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-registry-tls\") pod \"image-registry-b8c6f94d8-cghq8\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") " pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8" Apr 16 16:48:35.796657 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.796608 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/535dfd9c-5e07-4e18-886d-57be1138629f-cert\") pod \"ingress-canary-bdzw7\" (UID: \"535dfd9c-5e07-4e18-886d-57be1138629f\") " pod="openshift-ingress-canary/ingress-canary-bdzw7" Apr 16 16:48:35.796657 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.796629 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74b52111-9c5e-4b37-ab68-e34630312fcb-metrics-tls\") pod \"dns-default-k7p77\" (UID: \"74b52111-9c5e-4b37-ab68-e34630312fcb\") " pod="openshift-dns/dns-default-k7p77" Apr 16 16:48:35.797031 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.796673 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm9mr\" 
(UniqueName: \"kubernetes.io/projected/74b52111-9c5e-4b37-ab68-e34630312fcb-kube-api-access-pm9mr\") pod \"dns-default-k7p77\" (UID: \"74b52111-9c5e-4b37-ab68-e34630312fcb\") " pod="openshift-dns/dns-default-k7p77" Apr 16 16:48:35.797031 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.796695 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rmql\" (UniqueName: \"kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-kube-api-access-5rmql\") pod \"image-registry-b8c6f94d8-cghq8\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") " pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8" Apr 16 16:48:35.797031 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.796715 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-trusted-ca\") pod \"image-registry-b8c6f94d8-cghq8\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") " pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8" Apr 16 16:48:35.898522 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.898474 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-installation-pull-secrets\") pod \"image-registry-b8c6f94d8-cghq8\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") " pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8" Apr 16 16:48:35.898691 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.898534 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmt9c\" (UniqueName: \"kubernetes.io/projected/535dfd9c-5e07-4e18-886d-57be1138629f-kube-api-access-dmt9c\") pod \"ingress-canary-bdzw7\" (UID: \"535dfd9c-5e07-4e18-886d-57be1138629f\") " pod="openshift-ingress-canary/ingress-canary-bdzw7" Apr 16 
16:48:35.898691 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.898582 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74b52111-9c5e-4b37-ab68-e34630312fcb-config-volume\") pod \"dns-default-k7p77\" (UID: \"74b52111-9c5e-4b37-ab68-e34630312fcb\") " pod="openshift-dns/dns-default-k7p77"
Apr 16 16:48:35.898691 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.898618 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-image-registry-private-configuration\") pod \"image-registry-b8c6f94d8-cghq8\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") " pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8"
Apr 16 16:48:35.898691 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.898648 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-ca-trust-extracted\") pod \"image-registry-b8c6f94d8-cghq8\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") " pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8"
Apr 16 16:48:35.898691 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.898679 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-bound-sa-token\") pod \"image-registry-b8c6f94d8-cghq8\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") " pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8"
Apr 16 16:48:35.898922 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.898723 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-registry-certificates\") pod \"image-registry-b8c6f94d8-cghq8\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") " pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8"
Apr 16 16:48:35.898922 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.898756 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/74b52111-9c5e-4b37-ab68-e34630312fcb-tmp-dir\") pod \"dns-default-k7p77\" (UID: \"74b52111-9c5e-4b37-ab68-e34630312fcb\") " pod="openshift-dns/dns-default-k7p77"
Apr 16 16:48:35.898922 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.898786 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-registry-tls\") pod \"image-registry-b8c6f94d8-cghq8\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") " pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8"
Apr 16 16:48:35.898922 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.898812 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/535dfd9c-5e07-4e18-886d-57be1138629f-cert\") pod \"ingress-canary-bdzw7\" (UID: \"535dfd9c-5e07-4e18-886d-57be1138629f\") " pod="openshift-ingress-canary/ingress-canary-bdzw7"
Apr 16 16:48:35.898922 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.898845 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74b52111-9c5e-4b37-ab68-e34630312fcb-metrics-tls\") pod \"dns-default-k7p77\" (UID: \"74b52111-9c5e-4b37-ab68-e34630312fcb\") " pod="openshift-dns/dns-default-k7p77"
Apr 16 16:48:35.898922 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.898877 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pm9mr\" (UniqueName: \"kubernetes.io/projected/74b52111-9c5e-4b37-ab68-e34630312fcb-kube-api-access-pm9mr\") pod \"dns-default-k7p77\" (UID: \"74b52111-9c5e-4b37-ab68-e34630312fcb\") " pod="openshift-dns/dns-default-k7p77"
Apr 16 16:48:35.898922 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.898907 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rmql\" (UniqueName: \"kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-kube-api-access-5rmql\") pod \"image-registry-b8c6f94d8-cghq8\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") " pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8"
Apr 16 16:48:35.899272 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.898950 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-trusted-ca\") pod \"image-registry-b8c6f94d8-cghq8\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") " pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8"
Apr 16 16:48:35.901155 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.900939 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-trusted-ca\") pod \"image-registry-b8c6f94d8-cghq8\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") " pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8"
Apr 16 16:48:35.901284 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:35.900942 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 16:48:35.901284 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:35.901185 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b8c6f94d8-cghq8: secret "image-registry-tls" not found
Apr 16 16:48:35.901284 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:35.901256 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-registry-tls podName:fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:36.401238791 +0000 UTC m=+33.569950147 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-registry-tls") pod "image-registry-b8c6f94d8-cghq8" (UID: "fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4") : secret "image-registry-tls" not found
Apr 16 16:48:35.901759 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.901525 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-ca-trust-extracted\") pod \"image-registry-b8c6f94d8-cghq8\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") " pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8"
Apr 16 16:48:35.901759 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:35.901663 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:48:35.901759 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:35.901731 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74b52111-9c5e-4b37-ab68-e34630312fcb-metrics-tls podName:74b52111-9c5e-4b37-ab68-e34630312fcb nodeName:}" failed. No retries permitted until 2026-04-16 16:48:36.401713832 +0000 UTC m=+33.570425185 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/74b52111-9c5e-4b37-ab68-e34630312fcb-metrics-tls") pod "dns-default-k7p77" (UID: "74b52111-9c5e-4b37-ab68-e34630312fcb") : secret "dns-default-metrics-tls" not found
Apr 16 16:48:35.901759 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:35.901741 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:48:35.902032 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:35.901818 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/535dfd9c-5e07-4e18-886d-57be1138629f-cert podName:535dfd9c-5e07-4e18-886d-57be1138629f nodeName:}" failed. No retries permitted until 2026-04-16 16:48:36.401776353 +0000 UTC m=+33.570487707 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/535dfd9c-5e07-4e18-886d-57be1138629f-cert") pod "ingress-canary-bdzw7" (UID: "535dfd9c-5e07-4e18-886d-57be1138629f") : secret "canary-serving-cert" not found
Apr 16 16:48:35.902032 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.901282 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-registry-certificates\") pod \"image-registry-b8c6f94d8-cghq8\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") " pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8"
Apr 16 16:48:35.902211 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.902193 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/74b52111-9c5e-4b37-ab68-e34630312fcb-tmp-dir\") pod \"dns-default-k7p77\" (UID: \"74b52111-9c5e-4b37-ab68-e34630312fcb\") " pod="openshift-dns/dns-default-k7p77"
Apr 16 16:48:35.902681 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.902660 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74b52111-9c5e-4b37-ab68-e34630312fcb-config-volume\") pod \"dns-default-k7p77\" (UID: \"74b52111-9c5e-4b37-ab68-e34630312fcb\") " pod="openshift-dns/dns-default-k7p77"
Apr 16 16:48:35.905190 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.905166 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-image-registry-private-configuration\") pod \"image-registry-b8c6f94d8-cghq8\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") " pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8"
Apr 16 16:48:35.905296 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.905222 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-installation-pull-secrets\") pod \"image-registry-b8c6f94d8-cghq8\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") " pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8"
Apr 16 16:48:35.911307 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.911260 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rmql\" (UniqueName: \"kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-kube-api-access-5rmql\") pod \"image-registry-b8c6f94d8-cghq8\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") " pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8"
Apr 16 16:48:35.912235 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.912217 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm9mr\" (UniqueName: \"kubernetes.io/projected/74b52111-9c5e-4b37-ab68-e34630312fcb-kube-api-access-pm9mr\") pod \"dns-default-k7p77\" (UID: \"74b52111-9c5e-4b37-ab68-e34630312fcb\") " pod="openshift-dns/dns-default-k7p77"
Apr 16 16:48:35.912396 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.912369 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmt9c\" (UniqueName: \"kubernetes.io/projected/535dfd9c-5e07-4e18-886d-57be1138629f-kube-api-access-dmt9c\") pod \"ingress-canary-bdzw7\" (UID: \"535dfd9c-5e07-4e18-886d-57be1138629f\") " pod="openshift-ingress-canary/ingress-canary-bdzw7"
Apr 16 16:48:35.912665 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:35.912645 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-bound-sa-token\") pod \"image-registry-b8c6f94d8-cghq8\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") " pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8"
Apr 16 16:48:36.201433 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:36.201401 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/890f4655-f936-4bb9-b82c-524efb501585-metrics-certs\") pod \"network-metrics-daemon-dv6f9\" (UID: \"890f4655-f936-4bb9-b82c-524efb501585\") " pod="openshift-multus/network-metrics-daemon-dv6f9"
Apr 16 16:48:36.201621 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:36.201511 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:48:36.201621 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:36.201569 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/890f4655-f936-4bb9-b82c-524efb501585-metrics-certs podName:890f4655-f936-4bb9-b82c-524efb501585 nodeName:}" failed. No retries permitted until 2026-04-16 16:49:08.20155668 +0000 UTC m=+65.370268021 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/890f4655-f936-4bb9-b82c-524efb501585-metrics-certs") pod "network-metrics-daemon-dv6f9" (UID: "890f4655-f936-4bb9-b82c-524efb501585") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:48:36.301890 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:36.301854 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-564lt\" (UniqueName: \"kubernetes.io/projected/feb3fcea-2282-411d-bb57-2562cc290f0a-kube-api-access-564lt\") pod \"network-check-target-s7xbf\" (UID: \"feb3fcea-2282-411d-bb57-2562cc290f0a\") " pod="openshift-network-diagnostics/network-check-target-s7xbf"
Apr 16 16:48:36.302087 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:36.302025 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:48:36.302087 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:36.302050 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:48:36.302087 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:36.302078 2572 projected.go:194] Error preparing data for projected volume kube-api-access-564lt for pod openshift-network-diagnostics/network-check-target-s7xbf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:48:36.302239 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:36.302144 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/feb3fcea-2282-411d-bb57-2562cc290f0a-kube-api-access-564lt podName:feb3fcea-2282-411d-bb57-2562cc290f0a nodeName:}" failed. No retries permitted until 2026-04-16 16:49:08.302125638 +0000 UTC m=+65.470836979 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-564lt" (UniqueName: "kubernetes.io/projected/feb3fcea-2282-411d-bb57-2562cc290f0a-kube-api-access-564lt") pod "network-check-target-s7xbf" (UID: "feb3fcea-2282-411d-bb57-2562cc290f0a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:48:36.403245 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:36.403210 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-registry-tls\") pod \"image-registry-b8c6f94d8-cghq8\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") " pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8"
Apr 16 16:48:36.403245 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:36.403247 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/535dfd9c-5e07-4e18-886d-57be1138629f-cert\") pod \"ingress-canary-bdzw7\" (UID: \"535dfd9c-5e07-4e18-886d-57be1138629f\") " pod="openshift-ingress-canary/ingress-canary-bdzw7"
Apr 16 16:48:36.403515 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:36.403275 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74b52111-9c5e-4b37-ab68-e34630312fcb-metrics-tls\") pod \"dns-default-k7p77\" (UID: \"74b52111-9c5e-4b37-ab68-e34630312fcb\") " pod="openshift-dns/dns-default-k7p77"
Apr 16 16:48:36.403515 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:36.403359 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 16:48:36.403515 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:36.403378 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b8c6f94d8-cghq8: secret "image-registry-tls" not found
Apr 16 16:48:36.403515 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:36.403374 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:48:36.403515 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:36.403402 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:48:36.403515 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:36.403443 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-registry-tls podName:fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:37.403423762 +0000 UTC m=+34.572135120 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-registry-tls") pod "image-registry-b8c6f94d8-cghq8" (UID: "fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4") : secret "image-registry-tls" not found
Apr 16 16:48:36.403515 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:36.403462 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74b52111-9c5e-4b37-ab68-e34630312fcb-metrics-tls podName:74b52111-9c5e-4b37-ab68-e34630312fcb nodeName:}" failed. No retries permitted until 2026-04-16 16:48:37.403450833 +0000 UTC m=+34.572162189 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/74b52111-9c5e-4b37-ab68-e34630312fcb-metrics-tls") pod "dns-default-k7p77" (UID: "74b52111-9c5e-4b37-ab68-e34630312fcb") : secret "dns-default-metrics-tls" not found
Apr 16 16:48:36.403515 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:36.403475 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/535dfd9c-5e07-4e18-886d-57be1138629f-cert podName:535dfd9c-5e07-4e18-886d-57be1138629f nodeName:}" failed. No retries permitted until 2026-04-16 16:48:37.403468287 +0000 UTC m=+34.572179629 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/535dfd9c-5e07-4e18-886d-57be1138629f-cert") pod "ingress-canary-bdzw7" (UID: "535dfd9c-5e07-4e18-886d-57be1138629f") : secret "canary-serving-cert" not found
Apr 16 16:48:37.410672 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:37.410622 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-registry-tls\") pod \"image-registry-b8c6f94d8-cghq8\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") " pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8"
Apr 16 16:48:37.410672 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:37.410675 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/535dfd9c-5e07-4e18-886d-57be1138629f-cert\") pod \"ingress-canary-bdzw7\" (UID: \"535dfd9c-5e07-4e18-886d-57be1138629f\") " pod="openshift-ingress-canary/ingress-canary-bdzw7"
Apr 16 16:48:37.411175 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:37.410695 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74b52111-9c5e-4b37-ab68-e34630312fcb-metrics-tls\") pod \"dns-default-k7p77\" (UID: \"74b52111-9c5e-4b37-ab68-e34630312fcb\") " pod="openshift-dns/dns-default-k7p77"
Apr 16 16:48:37.411175 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:37.410787 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:48:37.411175 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:37.410793 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 16:48:37.411175 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:37.410813 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b8c6f94d8-cghq8: secret "image-registry-tls" not found
Apr 16 16:48:37.411175 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:37.410843 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74b52111-9c5e-4b37-ab68-e34630312fcb-metrics-tls podName:74b52111-9c5e-4b37-ab68-e34630312fcb nodeName:}" failed. No retries permitted until 2026-04-16 16:48:39.41082852 +0000 UTC m=+36.579539887 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/74b52111-9c5e-4b37-ab68-e34630312fcb-metrics-tls") pod "dns-default-k7p77" (UID: "74b52111-9c5e-4b37-ab68-e34630312fcb") : secret "dns-default-metrics-tls" not found
Apr 16 16:48:37.411175 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:37.410787 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:48:37.411175 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:37.410867 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-registry-tls podName:fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:39.410849053 +0000 UTC m=+36.579560411 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-registry-tls") pod "image-registry-b8c6f94d8-cghq8" (UID: "fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4") : secret "image-registry-tls" not found
Apr 16 16:48:37.411175 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:37.410882 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/535dfd9c-5e07-4e18-886d-57be1138629f-cert podName:535dfd9c-5e07-4e18-886d-57be1138629f nodeName:}" failed. No retries permitted until 2026-04-16 16:48:39.410875595 +0000 UTC m=+36.579586937 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/535dfd9c-5e07-4e18-886d-57be1138629f-cert") pod "ingress-canary-bdzw7" (UID: "535dfd9c-5e07-4e18-886d-57be1138629f") : secret "canary-serving-cert" not found
Apr 16 16:48:37.586617 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:37.586580 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s7xbf"
Apr 16 16:48:37.586830 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:37.586804 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv6f9"
Apr 16 16:48:37.589980 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:37.589933 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-v7sx7\""
Apr 16 16:48:37.590122 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:37.590012 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-mkc4m\""
Apr 16 16:48:37.591010 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:37.590992 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 16:48:37.591124 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:37.591043 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 16:48:37.591124 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:37.591104 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 16:48:38.577097 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:38.577052 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-nnbtw"]
Apr 16 16:48:38.587977 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:38.587956 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nnbtw"
Apr 16 16:48:38.588098 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:38.588037 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-nnbtw"]
Apr 16 16:48:38.590833 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:38.590815 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 16:48:38.721429 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:38.721398 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/35f5a8e4-d220-4bcd-bbc1-7031d2c0ad08-original-pull-secret\") pod \"global-pull-secret-syncer-nnbtw\" (UID: \"35f5a8e4-d220-4bcd-bbc1-7031d2c0ad08\") " pod="kube-system/global-pull-secret-syncer-nnbtw"
Apr 16 16:48:38.721543 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:38.721452 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/35f5a8e4-d220-4bcd-bbc1-7031d2c0ad08-dbus\") pod \"global-pull-secret-syncer-nnbtw\" (UID: \"35f5a8e4-d220-4bcd-bbc1-7031d2c0ad08\") " pod="kube-system/global-pull-secret-syncer-nnbtw"
Apr 16 16:48:38.721591 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:38.721540 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/35f5a8e4-d220-4bcd-bbc1-7031d2c0ad08-kubelet-config\") pod \"global-pull-secret-syncer-nnbtw\" (UID: \"35f5a8e4-d220-4bcd-bbc1-7031d2c0ad08\") " pod="kube-system/global-pull-secret-syncer-nnbtw"
Apr 16 16:48:38.809972 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:38.809943 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92gx2" event={"ID":"74ade214-8512-4cf5-93e8-0ece0e5776f2","Type":"ContainerStarted","Data":"e9244107c0c6d96c3bfc0ad4bb4dcaaeda8dcf1695483150f07b558696190113"}
Apr 16 16:48:38.822644 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:38.822618 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/35f5a8e4-d220-4bcd-bbc1-7031d2c0ad08-kubelet-config\") pod \"global-pull-secret-syncer-nnbtw\" (UID: \"35f5a8e4-d220-4bcd-bbc1-7031d2c0ad08\") " pod="kube-system/global-pull-secret-syncer-nnbtw"
Apr 16 16:48:38.822732 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:38.822686 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/35f5a8e4-d220-4bcd-bbc1-7031d2c0ad08-original-pull-secret\") pod \"global-pull-secret-syncer-nnbtw\" (UID: \"35f5a8e4-d220-4bcd-bbc1-7031d2c0ad08\") " pod="kube-system/global-pull-secret-syncer-nnbtw"
Apr 16 16:48:38.822732 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:38.822724 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/35f5a8e4-d220-4bcd-bbc1-7031d2c0ad08-dbus\") pod \"global-pull-secret-syncer-nnbtw\" (UID: \"35f5a8e4-d220-4bcd-bbc1-7031d2c0ad08\") " pod="kube-system/global-pull-secret-syncer-nnbtw"
Apr 16 16:48:38.822803 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:38.822735 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/35f5a8e4-d220-4bcd-bbc1-7031d2c0ad08-kubelet-config\") pod \"global-pull-secret-syncer-nnbtw\" (UID: \"35f5a8e4-d220-4bcd-bbc1-7031d2c0ad08\") " pod="kube-system/global-pull-secret-syncer-nnbtw"
Apr 16 16:48:38.822914 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:38.822900 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/35f5a8e4-d220-4bcd-bbc1-7031d2c0ad08-dbus\") pod \"global-pull-secret-syncer-nnbtw\" (UID: \"35f5a8e4-d220-4bcd-bbc1-7031d2c0ad08\") " pod="kube-system/global-pull-secret-syncer-nnbtw"
Apr 16 16:48:38.825240 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:38.825210 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/35f5a8e4-d220-4bcd-bbc1-7031d2c0ad08-original-pull-secret\") pod \"global-pull-secret-syncer-nnbtw\" (UID: \"35f5a8e4-d220-4bcd-bbc1-7031d2c0ad08\") " pod="kube-system/global-pull-secret-syncer-nnbtw"
Apr 16 16:48:38.896706 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:38.896669 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nnbtw"
Apr 16 16:48:39.042731 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:39.042578 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-nnbtw"]
Apr 16 16:48:39.046008 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:39.045980 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35f5a8e4_d220_4bcd_bbc1_7031d2c0ad08.slice/crio-57465e1da00a948fd1157d571938c310a25957caccae9b4413b8b4df38b64ccb WatchSource:0}: Error finding container 57465e1da00a948fd1157d571938c310a25957caccae9b4413b8b4df38b64ccb: Status 404 returned error can't find the container with id 57465e1da00a948fd1157d571938c310a25957caccae9b4413b8b4df38b64ccb
Apr 16 16:48:39.428340 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:39.428317 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-registry-tls\") pod \"image-registry-b8c6f94d8-cghq8\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") " pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8"
Apr 16 16:48:39.428441 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:39.428348 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/535dfd9c-5e07-4e18-886d-57be1138629f-cert\") pod \"ingress-canary-bdzw7\" (UID: \"535dfd9c-5e07-4e18-886d-57be1138629f\") " pod="openshift-ingress-canary/ingress-canary-bdzw7"
Apr 16 16:48:39.428441 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:39.428368 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74b52111-9c5e-4b37-ab68-e34630312fcb-metrics-tls\") pod \"dns-default-k7p77\" (UID: \"74b52111-9c5e-4b37-ab68-e34630312fcb\") " pod="openshift-dns/dns-default-k7p77"
Apr 16 16:48:39.428544 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:39.428483 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:48:39.428544 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:39.428501 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:48:39.428544 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:39.428530 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 16:48:39.428656 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:39.428546 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74b52111-9c5e-4b37-ab68-e34630312fcb-metrics-tls podName:74b52111-9c5e-4b37-ab68-e34630312fcb nodeName:}" failed. No retries permitted until 2026-04-16 16:48:43.428533493 +0000 UTC m=+40.597244834 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/74b52111-9c5e-4b37-ab68-e34630312fcb-metrics-tls") pod "dns-default-k7p77" (UID: "74b52111-9c5e-4b37-ab68-e34630312fcb") : secret "dns-default-metrics-tls" not found
Apr 16 16:48:39.428656 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:39.428547 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b8c6f94d8-cghq8: secret "image-registry-tls" not found
Apr 16 16:48:39.428656 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:39.428559 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/535dfd9c-5e07-4e18-886d-57be1138629f-cert podName:535dfd9c-5e07-4e18-886d-57be1138629f nodeName:}" failed. No retries permitted until 2026-04-16 16:48:43.428553855 +0000 UTC m=+40.597265196 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/535dfd9c-5e07-4e18-886d-57be1138629f-cert") pod "ingress-canary-bdzw7" (UID: "535dfd9c-5e07-4e18-886d-57be1138629f") : secret "canary-serving-cert" not found
Apr 16 16:48:39.428656 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:39.428597 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-registry-tls podName:fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:43.428579968 +0000 UTC m=+40.597291324 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-registry-tls") pod "image-registry-b8c6f94d8-cghq8" (UID: "fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4") : secret "image-registry-tls" not found
Apr 16 16:48:39.814559 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:39.814485 2572 generic.go:358] "Generic (PLEG): container finished" podID="74ade214-8512-4cf5-93e8-0ece0e5776f2" containerID="e9244107c0c6d96c3bfc0ad4bb4dcaaeda8dcf1695483150f07b558696190113" exitCode=0
Apr 16 16:48:39.814559 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:39.814536 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92gx2" event={"ID":"74ade214-8512-4cf5-93e8-0ece0e5776f2","Type":"ContainerDied","Data":"e9244107c0c6d96c3bfc0ad4bb4dcaaeda8dcf1695483150f07b558696190113"}
Apr 16 16:48:39.815940 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:39.815673 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-nnbtw" event={"ID":"35f5a8e4-d220-4bcd-bbc1-7031d2c0ad08","Type":"ContainerStarted","Data":"57465e1da00a948fd1157d571938c310a25957caccae9b4413b8b4df38b64ccb"}
Apr 16 16:48:40.822151 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:40.821354 2572 generic.go:358] "Generic (PLEG): container finished" podID="74ade214-8512-4cf5-93e8-0ece0e5776f2" containerID="dc1caa4d6627e5c06c43624f6ff80fa047d8e90a38a004dd96493bcce6521d7b" exitCode=0
Apr 16 16:48:40.822151 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:40.821416 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92gx2" event={"ID":"74ade214-8512-4cf5-93e8-0ece0e5776f2","Type":"ContainerDied","Data":"dc1caa4d6627e5c06c43624f6ff80fa047d8e90a38a004dd96493bcce6521d7b"}
Apr 16 16:48:41.826447 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:41.826412 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92gx2" event={"ID":"74ade214-8512-4cf5-93e8-0ece0e5776f2","Type":"ContainerStarted","Data":"925f9e7c0731c547a01109622d53117e5c291fdebf58718bccb1e74af3fc341f"}
Apr 16 16:48:41.849781 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:41.849722 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-92gx2" podStartSLOduration=5.037135671 podStartE2EDuration="38.849708257s" podCreationTimestamp="2026-04-16 16:48:03 +0000 UTC" firstStartedPulling="2026-04-16 16:48:04.858914651 +0000 UTC m=+2.027625994" lastFinishedPulling="2026-04-16 16:48:38.671487221 +0000 UTC m=+35.840198580" observedRunningTime="2026-04-16 16:48:41.84888291 +0000 UTC m=+39.017594275" watchObservedRunningTime="2026-04-16 16:48:41.849708257 +0000 UTC m=+39.018419618"
Apr 16 16:48:43.464937 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:43.464903 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-registry-tls\") pod \"image-registry-b8c6f94d8-cghq8\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") " pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8"
Apr 16 16:48:43.464937 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:43.464939 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/535dfd9c-5e07-4e18-886d-57be1138629f-cert\") pod \"ingress-canary-bdzw7\" (UID: \"535dfd9c-5e07-4e18-886d-57be1138629f\") " pod="openshift-ingress-canary/ingress-canary-bdzw7"
Apr 16 16:48:43.465342 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:43.464962 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74b52111-9c5e-4b37-ab68-e34630312fcb-metrics-tls\") pod \"dns-default-k7p77\" (UID: \"74b52111-9c5e-4b37-ab68-e34630312fcb\") " pod="openshift-dns/dns-default-k7p77"
Apr 16 16:48:43.465342 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:43.465043 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 16:48:43.465342 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:43.465081 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b8c6f94d8-cghq8: secret "image-registry-tls" not found
Apr 16 16:48:43.465342 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:43.465127 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-registry-tls podName:fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:51.465112563 +0000 UTC m=+48.633823904 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-registry-tls") pod "image-registry-b8c6f94d8-cghq8" (UID: "fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4") : secret "image-registry-tls" not found
Apr 16 16:48:43.465342 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:43.465046 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:48:43.465342 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:43.465192 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/535dfd9c-5e07-4e18-886d-57be1138629f-cert podName:535dfd9c-5e07-4e18-886d-57be1138629f nodeName:}" failed. No retries permitted until 2026-04-16 16:48:51.465179225 +0000 UTC m=+48.633890569 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/535dfd9c-5e07-4e18-886d-57be1138629f-cert") pod "ingress-canary-bdzw7" (UID: "535dfd9c-5e07-4e18-886d-57be1138629f") : secret "canary-serving-cert" not found Apr 16 16:48:43.465342 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:43.465050 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:48:43.465342 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:43.465222 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74b52111-9c5e-4b37-ab68-e34630312fcb-metrics-tls podName:74b52111-9c5e-4b37-ab68-e34630312fcb nodeName:}" failed. No retries permitted until 2026-04-16 16:48:51.465213714 +0000 UTC m=+48.633925058 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/74b52111-9c5e-4b37-ab68-e34630312fcb-metrics-tls") pod "dns-default-k7p77" (UID: "74b52111-9c5e-4b37-ab68-e34630312fcb") : secret "dns-default-metrics-tls" not found Apr 16 16:48:43.831414 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:43.831332 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-nnbtw" event={"ID":"35f5a8e4-d220-4bcd-bbc1-7031d2c0ad08","Type":"ContainerStarted","Data":"d3937c6e6476303cd1b10e28826196edbae2ea688e0ec0d18efd4ef627d94b72"} Apr 16 16:48:43.847761 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:43.847699 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-nnbtw" podStartSLOduration=2.098688246 podStartE2EDuration="5.847686586s" podCreationTimestamp="2026-04-16 16:48:38 +0000 UTC" firstStartedPulling="2026-04-16 16:48:39.048364139 +0000 UTC m=+36.217075480" lastFinishedPulling="2026-04-16 16:48:42.797362273 +0000 UTC m=+39.966073820" observedRunningTime="2026-04-16 16:48:43.847445162 +0000 UTC m=+41.016156526" 
watchObservedRunningTime="2026-04-16 16:48:43.847686586 +0000 UTC m=+41.016397949" Apr 16 16:48:46.988886 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:46.988851 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-wqb46"] Apr 16 16:48:46.990662 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:46.990646 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-wqb46" Apr 16 16:48:46.994458 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:46.994437 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-cdqw5\"" Apr 16 16:48:46.994458 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:46.994447 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 16:48:46.994624 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:46.994511 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 16:48:46.999256 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:46.999234 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-wqb46"] Apr 16 16:48:47.091330 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:47.091295 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2dns\" (UniqueName: \"kubernetes.io/projected/b8070376-8817-4c60-a98b-631847c5de08-kube-api-access-m2dns\") pod \"migrator-64d4d94569-wqb46\" (UID: \"b8070376-8817-4c60-a98b-631847c5de08\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-wqb46" Apr 16 16:48:47.191785 ip-10-0-137-126 kubenswrapper[2572]: I0416 
16:48:47.191763 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2dns\" (UniqueName: \"kubernetes.io/projected/b8070376-8817-4c60-a98b-631847c5de08-kube-api-access-m2dns\") pod \"migrator-64d4d94569-wqb46\" (UID: \"b8070376-8817-4c60-a98b-631847c5de08\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-wqb46" Apr 16 16:48:47.202152 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:47.202128 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2dns\" (UniqueName: \"kubernetes.io/projected/b8070376-8817-4c60-a98b-631847c5de08-kube-api-access-m2dns\") pod \"migrator-64d4d94569-wqb46\" (UID: \"b8070376-8817-4c60-a98b-631847c5de08\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-wqb46" Apr 16 16:48:47.299156 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:47.299108 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-wqb46" Apr 16 16:48:47.419874 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:47.419848 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-wqb46"] Apr 16 16:48:47.839351 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:47.839323 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-wqb46" event={"ID":"b8070376-8817-4c60-a98b-631847c5de08","Type":"ContainerStarted","Data":"f8c11b0f093708340f6b8e70ff393cff832323e9803419e8dfc18bdf95d5b98c"} Apr 16 16:48:49.844798 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:49.844764 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-wqb46" event={"ID":"b8070376-8817-4c60-a98b-631847c5de08","Type":"ContainerStarted","Data":"3d844a765c9f0237d761931f835c44477d69b3ff4a7cd1b9db54d1f5755e0758"} Apr 16 
16:48:49.844798 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:49.844797 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-wqb46" event={"ID":"b8070376-8817-4c60-a98b-631847c5de08","Type":"ContainerStarted","Data":"c88a25b5fa4e6a469cb173a1d7cd0e9e86b4419127370a86c4bbb5cfcc794079"} Apr 16 16:48:49.862566 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:49.862525 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-wqb46" podStartSLOduration=2.372358408 podStartE2EDuration="3.862511574s" podCreationTimestamp="2026-04-16 16:48:46 +0000 UTC" firstStartedPulling="2026-04-16 16:48:47.427951319 +0000 UTC m=+44.596662661" lastFinishedPulling="2026-04-16 16:48:48.918104486 +0000 UTC m=+46.086815827" observedRunningTime="2026-04-16 16:48:49.860933107 +0000 UTC m=+47.029644470" watchObservedRunningTime="2026-04-16 16:48:49.862511574 +0000 UTC m=+47.031222937" Apr 16 16:48:49.923584 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:49.923559 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-kmkmx"] Apr 16 16:48:49.925384 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:49.925371 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-kmkmx" Apr 16 16:48:49.929841 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:49.929813 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 16:48:49.929841 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:49.929813 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-bl6cb\"" Apr 16 16:48:49.929998 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:49.929844 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 16:48:49.929998 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:49.929856 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 16:48:49.929998 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:49.929910 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 16:48:49.934796 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:49.934775 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-kmkmx"] Apr 16 16:48:50.095355 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:50.095301 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-p5bf5_b3dfeb2f-a4ab-4fe5-ab2c-6c8dacead269/dns-node-resolver/0.log" Apr 16 16:48:50.113755 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:50.113734 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/216518cd-f791-45c1-968f-bba4c0ae219e-signing-key\") pod \"service-ca-bfc587fb7-kmkmx\" (UID: \"216518cd-f791-45c1-968f-bba4c0ae219e\") " pod="openshift-service-ca/service-ca-bfc587fb7-kmkmx" Apr 
16 16:48:50.113813 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:50.113790 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k6z8\" (UniqueName: \"kubernetes.io/projected/216518cd-f791-45c1-968f-bba4c0ae219e-kube-api-access-7k6z8\") pod \"service-ca-bfc587fb7-kmkmx\" (UID: \"216518cd-f791-45c1-968f-bba4c0ae219e\") " pod="openshift-service-ca/service-ca-bfc587fb7-kmkmx" Apr 16 16:48:50.113858 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:50.113823 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/216518cd-f791-45c1-968f-bba4c0ae219e-signing-cabundle\") pod \"service-ca-bfc587fb7-kmkmx\" (UID: \"216518cd-f791-45c1-968f-bba4c0ae219e\") " pod="openshift-service-ca/service-ca-bfc587fb7-kmkmx" Apr 16 16:48:50.214406 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:50.214380 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/216518cd-f791-45c1-968f-bba4c0ae219e-signing-key\") pod \"service-ca-bfc587fb7-kmkmx\" (UID: \"216518cd-f791-45c1-968f-bba4c0ae219e\") " pod="openshift-service-ca/service-ca-bfc587fb7-kmkmx" Apr 16 16:48:50.214498 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:50.214464 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7k6z8\" (UniqueName: \"kubernetes.io/projected/216518cd-f791-45c1-968f-bba4c0ae219e-kube-api-access-7k6z8\") pod \"service-ca-bfc587fb7-kmkmx\" (UID: \"216518cd-f791-45c1-968f-bba4c0ae219e\") " pod="openshift-service-ca/service-ca-bfc587fb7-kmkmx" Apr 16 16:48:50.214540 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:50.214500 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/216518cd-f791-45c1-968f-bba4c0ae219e-signing-cabundle\") 
pod \"service-ca-bfc587fb7-kmkmx\" (UID: \"216518cd-f791-45c1-968f-bba4c0ae219e\") " pod="openshift-service-ca/service-ca-bfc587fb7-kmkmx" Apr 16 16:48:50.215165 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:50.215148 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/216518cd-f791-45c1-968f-bba4c0ae219e-signing-cabundle\") pod \"service-ca-bfc587fb7-kmkmx\" (UID: \"216518cd-f791-45c1-968f-bba4c0ae219e\") " pod="openshift-service-ca/service-ca-bfc587fb7-kmkmx" Apr 16 16:48:50.216708 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:50.216692 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/216518cd-f791-45c1-968f-bba4c0ae219e-signing-key\") pod \"service-ca-bfc587fb7-kmkmx\" (UID: \"216518cd-f791-45c1-968f-bba4c0ae219e\") " pod="openshift-service-ca/service-ca-bfc587fb7-kmkmx" Apr 16 16:48:50.222557 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:50.222540 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k6z8\" (UniqueName: \"kubernetes.io/projected/216518cd-f791-45c1-968f-bba4c0ae219e-kube-api-access-7k6z8\") pod \"service-ca-bfc587fb7-kmkmx\" (UID: \"216518cd-f791-45c1-968f-bba4c0ae219e\") " pod="openshift-service-ca/service-ca-bfc587fb7-kmkmx" Apr 16 16:48:50.236352 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:50.236335 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-kmkmx" Apr 16 16:48:50.345029 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:50.345002 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-kmkmx"] Apr 16 16:48:50.348255 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:48:50.348225 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod216518cd_f791_45c1_968f_bba4c0ae219e.slice/crio-528c7fc81a95e2f7ccbbddaef5230f9e0fe3ce029ed7ea2fcd20959047efa32b WatchSource:0}: Error finding container 528c7fc81a95e2f7ccbbddaef5230f9e0fe3ce029ed7ea2fcd20959047efa32b: Status 404 returned error can't find the container with id 528c7fc81a95e2f7ccbbddaef5230f9e0fe3ce029ed7ea2fcd20959047efa32b Apr 16 16:48:50.847655 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:50.847625 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-kmkmx" event={"ID":"216518cd-f791-45c1-968f-bba4c0ae219e","Type":"ContainerStarted","Data":"528c7fc81a95e2f7ccbbddaef5230f9e0fe3ce029ed7ea2fcd20959047efa32b"} Apr 16 16:48:50.894705 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:50.894677 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9tnfz_caaa1d46-b551-4960-b546-994b0ee36fed/node-ca/0.log" Apr 16 16:48:51.522134 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:51.522098 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-registry-tls\") pod \"image-registry-b8c6f94d8-cghq8\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") " pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8" Apr 16 16:48:51.522134 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:51.522140 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/535dfd9c-5e07-4e18-886d-57be1138629f-cert\") pod \"ingress-canary-bdzw7\" (UID: \"535dfd9c-5e07-4e18-886d-57be1138629f\") " pod="openshift-ingress-canary/ingress-canary-bdzw7" Apr 16 16:48:51.522337 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:51.522243 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:48:51.522337 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:51.522262 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b8c6f94d8-cghq8: secret "image-registry-tls" not found Apr 16 16:48:51.522337 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:51.522264 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74b52111-9c5e-4b37-ab68-e34630312fcb-metrics-tls\") pod \"dns-default-k7p77\" (UID: \"74b52111-9c5e-4b37-ab68-e34630312fcb\") " pod="openshift-dns/dns-default-k7p77" Apr 16 16:48:51.522337 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:51.522322 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-registry-tls podName:fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4 nodeName:}" failed. No retries permitted until 2026-04-16 16:49:07.522293782 +0000 UTC m=+64.691005123 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-registry-tls") pod "image-registry-b8c6f94d8-cghq8" (UID: "fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4") : secret "image-registry-tls" not found Apr 16 16:48:51.522337 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:51.522243 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:48:51.522589 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:51.522402 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:48:51.522589 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:51.522410 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/535dfd9c-5e07-4e18-886d-57be1138629f-cert podName:535dfd9c-5e07-4e18-886d-57be1138629f nodeName:}" failed. No retries permitted until 2026-04-16 16:49:07.522391032 +0000 UTC m=+64.691102383 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/535dfd9c-5e07-4e18-886d-57be1138629f-cert") pod "ingress-canary-bdzw7" (UID: "535dfd9c-5e07-4e18-886d-57be1138629f") : secret "canary-serving-cert" not found Apr 16 16:48:51.522589 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:48:51.522441 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74b52111-9c5e-4b37-ab68-e34630312fcb-metrics-tls podName:74b52111-9c5e-4b37-ab68-e34630312fcb nodeName:}" failed. No retries permitted until 2026-04-16 16:49:07.522430946 +0000 UTC m=+64.691142288 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/74b52111-9c5e-4b37-ab68-e34630312fcb-metrics-tls") pod "dns-default-k7p77" (UID: "74b52111-9c5e-4b37-ab68-e34630312fcb") : secret "dns-default-metrics-tls" not found Apr 16 16:48:51.895715 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:51.895686 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-wqb46_b8070376-8817-4c60-a98b-631847c5de08/migrator/0.log" Apr 16 16:48:52.094660 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:52.094641 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-wqb46_b8070376-8817-4c60-a98b-631847c5de08/graceful-termination/0.log" Apr 16 16:48:52.852524 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:52.852490 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-kmkmx" event={"ID":"216518cd-f791-45c1-968f-bba4c0ae219e","Type":"ContainerStarted","Data":"a80770600a415d0e4c222781c8bb0e1d94f5375b3933433cf066d9933ac5e7ee"} Apr 16 16:48:52.869608 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:52.869566 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-kmkmx" podStartSLOduration=2.173037372 podStartE2EDuration="3.869552029s" podCreationTimestamp="2026-04-16 16:48:49 +0000 UTC" firstStartedPulling="2026-04-16 16:48:50.350019088 +0000 UTC m=+47.518730429" lastFinishedPulling="2026-04-16 16:48:52.04653374 +0000 UTC m=+49.215245086" observedRunningTime="2026-04-16 16:48:52.869116739 +0000 UTC m=+50.037828104" watchObservedRunningTime="2026-04-16 16:48:52.869552029 +0000 UTC m=+50.038263392" Apr 16 16:49:01.799825 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:01.799795 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-brhp4" Apr 
16 16:49:07.538788 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:07.538754 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-registry-tls\") pod \"image-registry-b8c6f94d8-cghq8\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") " pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8" Apr 16 16:49:07.538788 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:07.538790 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/535dfd9c-5e07-4e18-886d-57be1138629f-cert\") pod \"ingress-canary-bdzw7\" (UID: \"535dfd9c-5e07-4e18-886d-57be1138629f\") " pod="openshift-ingress-canary/ingress-canary-bdzw7" Apr 16 16:49:07.539227 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:07.538809 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74b52111-9c5e-4b37-ab68-e34630312fcb-metrics-tls\") pod \"dns-default-k7p77\" (UID: \"74b52111-9c5e-4b37-ab68-e34630312fcb\") " pod="openshift-dns/dns-default-k7p77" Apr 16 16:49:07.541203 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:07.541179 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74b52111-9c5e-4b37-ab68-e34630312fcb-metrics-tls\") pod \"dns-default-k7p77\" (UID: \"74b52111-9c5e-4b37-ab68-e34630312fcb\") " pod="openshift-dns/dns-default-k7p77" Apr 16 16:49:07.541203 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:07.541198 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/535dfd9c-5e07-4e18-886d-57be1138629f-cert\") pod \"ingress-canary-bdzw7\" (UID: \"535dfd9c-5e07-4e18-886d-57be1138629f\") " pod="openshift-ingress-canary/ingress-canary-bdzw7" Apr 16 16:49:07.541322 ip-10-0-137-126 kubenswrapper[2572]: I0416 
16:49:07.541263 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-registry-tls\") pod \"image-registry-b8c6f94d8-cghq8\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") " pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8" Apr 16 16:49:07.800350 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:07.800275 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-lm58j\"" Apr 16 16:49:07.808445 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:07.808424 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8" Apr 16 16:49:07.814583 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:07.814564 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fwjqf\"" Apr 16 16:49:07.822787 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:07.822764 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-k7p77" Apr 16 16:49:07.833264 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:07.833240 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gwwfm\"" Apr 16 16:49:07.840982 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:07.840958 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bdzw7" Apr 16 16:49:07.954094 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:07.954052 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-k7p77"] Apr 16 16:49:07.957666 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:49:07.957621 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74b52111_9c5e_4b37_ab68_e34630312fcb.slice/crio-54bfb781f3d022b5da11687e4691caac3dc68a4be2c6240769e6ad812f9df1a3 WatchSource:0}: Error finding container 54bfb781f3d022b5da11687e4691caac3dc68a4be2c6240769e6ad812f9df1a3: Status 404 returned error can't find the container with id 54bfb781f3d022b5da11687e4691caac3dc68a4be2c6240769e6ad812f9df1a3 Apr 16 16:49:07.973247 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:07.973223 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-b8c6f94d8-cghq8"] Apr 16 16:49:07.976563 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:49:07.976538 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa4c43d0_f045_46fe_9f44_2a2fb4a7e9d4.slice/crio-2007da6542bfc20a44bc0b16fc764a8e11e2702847e20f5e0c273dd26a935d27 WatchSource:0}: Error finding container 2007da6542bfc20a44bc0b16fc764a8e11e2702847e20f5e0c273dd26a935d27: Status 404 returned error can't find the container with id 2007da6542bfc20a44bc0b16fc764a8e11e2702847e20f5e0c273dd26a935d27 Apr 16 16:49:07.983443 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:07.983422 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bdzw7"] Apr 16 16:49:07.986022 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:49:07.986001 2572 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod535dfd9c_5e07_4e18_886d_57be1138629f.slice/crio-3c35484205fb33828da69867dcfd909fc34b3f11e778f87739af1fe99650cfb5 WatchSource:0}: Error finding container 3c35484205fb33828da69867dcfd909fc34b3f11e778f87739af1fe99650cfb5: Status 404 returned error can't find the container with id 3c35484205fb33828da69867dcfd909fc34b3f11e778f87739af1fe99650cfb5 Apr 16 16:49:08.244383 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:08.244348 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/890f4655-f936-4bb9-b82c-524efb501585-metrics-certs\") pod \"network-metrics-daemon-dv6f9\" (UID: \"890f4655-f936-4bb9-b82c-524efb501585\") " pod="openshift-multus/network-metrics-daemon-dv6f9" Apr 16 16:49:08.247405 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:08.247383 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 16:49:08.257481 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:08.257456 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/890f4655-f936-4bb9-b82c-524efb501585-metrics-certs\") pod \"network-metrics-daemon-dv6f9\" (UID: \"890f4655-f936-4bb9-b82c-524efb501585\") " pod="openshift-multus/network-metrics-daemon-dv6f9" Apr 16 16:49:08.344981 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:08.344913 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-564lt\" (UniqueName: \"kubernetes.io/projected/feb3fcea-2282-411d-bb57-2562cc290f0a-kube-api-access-564lt\") pod \"network-check-target-s7xbf\" (UID: \"feb3fcea-2282-411d-bb57-2562cc290f0a\") " pod="openshift-network-diagnostics/network-check-target-s7xbf" Apr 16 16:49:08.348030 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:08.348014 2572 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 16:49:08.357922 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:08.357899 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 16:49:08.368592 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:08.368564 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-564lt\" (UniqueName: \"kubernetes.io/projected/feb3fcea-2282-411d-bb57-2562cc290f0a-kube-api-access-564lt\") pod \"network-check-target-s7xbf\" (UID: \"feb3fcea-2282-411d-bb57-2562cc290f0a\") " pod="openshift-network-diagnostics/network-check-target-s7xbf" Apr 16 16:49:08.500581 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:08.500549 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-mkc4m\"" Apr 16 16:49:08.505030 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:08.504997 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-v7sx7\"" Apr 16 16:49:08.508155 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:08.508129 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s7xbf" Apr 16 16:49:08.512993 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:08.512891 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv6f9" Apr 16 16:49:08.663206 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:08.663174 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-s7xbf"] Apr 16 16:49:08.666643 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:49:08.666614 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfeb3fcea_2282_411d_bb57_2562cc290f0a.slice/crio-47752dff99becf1dfa06124bec7527926b18fce930e927b51e0c0eff6430965c WatchSource:0}: Error finding container 47752dff99becf1dfa06124bec7527926b18fce930e927b51e0c0eff6430965c: Status 404 returned error can't find the container with id 47752dff99becf1dfa06124bec7527926b18fce930e927b51e0c0eff6430965c Apr 16 16:49:08.675210 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:08.675185 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dv6f9"] Apr 16 16:49:08.678175 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:49:08.678147 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod890f4655_f936_4bb9_b82c_524efb501585.slice/crio-91795d7089d9ccded56f74d5aa60158071ba4f1645cd1a1d5a7408b4125c01c7 WatchSource:0}: Error finding container 91795d7089d9ccded56f74d5aa60158071ba4f1645cd1a1d5a7408b4125c01c7: Status 404 returned error can't find the container with id 91795d7089d9ccded56f74d5aa60158071ba4f1645cd1a1d5a7408b4125c01c7 Apr 16 16:49:08.884723 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:08.884671 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bdzw7" event={"ID":"535dfd9c-5e07-4e18-886d-57be1138629f","Type":"ContainerStarted","Data":"3c35484205fb33828da69867dcfd909fc34b3f11e778f87739af1fe99650cfb5"} Apr 16 16:49:08.886654 ip-10-0-137-126 kubenswrapper[2572]: I0416 
16:49:08.886410 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8" event={"ID":"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4","Type":"ContainerStarted","Data":"68edcd3f53c00d478a62dec639076b2d395094db2a723b459490f7205f50e751"} Apr 16 16:49:08.886654 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:08.886470 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8" event={"ID":"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4","Type":"ContainerStarted","Data":"2007da6542bfc20a44bc0b16fc764a8e11e2702847e20f5e0c273dd26a935d27"} Apr 16 16:49:08.886654 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:08.886526 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8" Apr 16 16:49:08.887745 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:08.887648 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-s7xbf" event={"ID":"feb3fcea-2282-411d-bb57-2562cc290f0a","Type":"ContainerStarted","Data":"47752dff99becf1dfa06124bec7527926b18fce930e927b51e0c0eff6430965c"} Apr 16 16:49:08.888772 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:08.888734 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-k7p77" event={"ID":"74b52111-9c5e-4b37-ab68-e34630312fcb","Type":"ContainerStarted","Data":"54bfb781f3d022b5da11687e4691caac3dc68a4be2c6240769e6ad812f9df1a3"} Apr 16 16:49:08.889865 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:08.889831 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dv6f9" event={"ID":"890f4655-f936-4bb9-b82c-524efb501585","Type":"ContainerStarted","Data":"91795d7089d9ccded56f74d5aa60158071ba4f1645cd1a1d5a7408b4125c01c7"} Apr 16 16:49:08.905157 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:08.905113 2572 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8" podStartSLOduration=35.905098702 podStartE2EDuration="35.905098702s" podCreationTimestamp="2026-04-16 16:48:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:49:08.90482659 +0000 UTC m=+66.073537954" watchObservedRunningTime="2026-04-16 16:49:08.905098702 +0000 UTC m=+66.073810065" Apr 16 16:49:09.660686 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.660590 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-rgfgm"] Apr 16 16:49:09.679045 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.678999 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-b8c6f94d8-cghq8"] Apr 16 16:49:09.679520 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.679475 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-586b57c7b4-rgfgm" Apr 16 16:49:09.679520 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.679496 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-rgfgm"] Apr 16 16:49:09.684582 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.683549 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 16:49:09.684582 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.683686 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-d7jkh\"" Apr 16 16:49:09.684582 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.683768 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 16:49:09.757023 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.756991 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q5jt\" (UniqueName: \"kubernetes.io/projected/20b249ed-1937-4b5b-b328-8a3db9d456fc-kube-api-access-8q5jt\") pod \"downloads-586b57c7b4-rgfgm\" (UID: \"20b249ed-1937-4b5b-b328-8a3db9d456fc\") " pod="openshift-console/downloads-586b57c7b4-rgfgm" Apr 16 16:49:09.761528 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.761505 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-p6ljq"] Apr 16 16:49:09.775718 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.775679 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5b8c7794b5-ntls8"] Apr 16 16:49:09.777194 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.775832 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-p6ljq" Apr 16 16:49:09.781281 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.780452 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-jjdf6\"" Apr 16 16:49:09.781281 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.780696 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 16:49:09.781281 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.780913 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 16:49:09.781281 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.780948 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 16:49:09.781281 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.780948 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 16:49:09.790014 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.789993 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h2crd"] Apr 16 16:49:09.790174 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.790156 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5b8c7794b5-ntls8" Apr 16 16:49:09.803532 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.802648 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h2crd"] Apr 16 16:49:09.803532 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.802671 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-p6ljq"] Apr 16 16:49:09.803532 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.802683 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5b8c7794b5-ntls8"] Apr 16 16:49:09.803532 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.802778 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h2crd" Apr 16 16:49:09.806017 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.805983 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-xlb7k\"" Apr 16 16:49:09.806251 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.806233 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 16 16:49:09.857946 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.857916 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/18cb8262-52e8-4ba6-87cb-9349969eed30-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-p6ljq\" (UID: \"18cb8262-52e8-4ba6-87cb-9349969eed30\") " pod="openshift-insights/insights-runtime-extractor-p6ljq" Apr 16 16:49:09.858134 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.857961 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8q5jt\" (UniqueName: \"kubernetes.io/projected/20b249ed-1937-4b5b-b328-8a3db9d456fc-kube-api-access-8q5jt\") pod \"downloads-586b57c7b4-rgfgm\" (UID: \"20b249ed-1937-4b5b-b328-8a3db9d456fc\") " pod="openshift-console/downloads-586b57c7b4-rgfgm" Apr 16 16:49:09.858134 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.857996 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/18cb8262-52e8-4ba6-87cb-9349969eed30-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-p6ljq\" (UID: \"18cb8262-52e8-4ba6-87cb-9349969eed30\") " pod="openshift-insights/insights-runtime-extractor-p6ljq" Apr 16 16:49:09.858134 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.858026 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/12ccdd0e-4d6f-4501-bcff-5d8a87aa8417-image-registry-private-configuration\") pod \"image-registry-5b8c7794b5-ntls8\" (UID: \"12ccdd0e-4d6f-4501-bcff-5d8a87aa8417\") " pod="openshift-image-registry/image-registry-5b8c7794b5-ntls8" Apr 16 16:49:09.858134 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.858050 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pkrc\" (UniqueName: \"kubernetes.io/projected/12ccdd0e-4d6f-4501-bcff-5d8a87aa8417-kube-api-access-9pkrc\") pod \"image-registry-5b8c7794b5-ntls8\" (UID: \"12ccdd0e-4d6f-4501-bcff-5d8a87aa8417\") " pod="openshift-image-registry/image-registry-5b8c7794b5-ntls8" Apr 16 16:49:09.858134 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.858100 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/12ccdd0e-4d6f-4501-bcff-5d8a87aa8417-trusted-ca\") pod \"image-registry-5b8c7794b5-ntls8\" (UID: \"12ccdd0e-4d6f-4501-bcff-5d8a87aa8417\") " pod="openshift-image-registry/image-registry-5b8c7794b5-ntls8" Apr 16 16:49:09.858427 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.858160 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/18cb8262-52e8-4ba6-87cb-9349969eed30-data-volume\") pod \"insights-runtime-extractor-p6ljq\" (UID: \"18cb8262-52e8-4ba6-87cb-9349969eed30\") " pod="openshift-insights/insights-runtime-extractor-p6ljq" Apr 16 16:49:09.858427 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.858184 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/12ccdd0e-4d6f-4501-bcff-5d8a87aa8417-registry-tls\") pod \"image-registry-5b8c7794b5-ntls8\" (UID: \"12ccdd0e-4d6f-4501-bcff-5d8a87aa8417\") " pod="openshift-image-registry/image-registry-5b8c7794b5-ntls8" Apr 16 16:49:09.858427 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.858209 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/12ccdd0e-4d6f-4501-bcff-5d8a87aa8417-ca-trust-extracted\") pod \"image-registry-5b8c7794b5-ntls8\" (UID: \"12ccdd0e-4d6f-4501-bcff-5d8a87aa8417\") " pod="openshift-image-registry/image-registry-5b8c7794b5-ntls8" Apr 16 16:49:09.858427 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.858237 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/12ccdd0e-4d6f-4501-bcff-5d8a87aa8417-bound-sa-token\") pod \"image-registry-5b8c7794b5-ntls8\" (UID: \"12ccdd0e-4d6f-4501-bcff-5d8a87aa8417\") " 
pod="openshift-image-registry/image-registry-5b8c7794b5-ntls8" Apr 16 16:49:09.858427 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.858330 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/12ccdd0e-4d6f-4501-bcff-5d8a87aa8417-installation-pull-secrets\") pod \"image-registry-5b8c7794b5-ntls8\" (UID: \"12ccdd0e-4d6f-4501-bcff-5d8a87aa8417\") " pod="openshift-image-registry/image-registry-5b8c7794b5-ntls8" Apr 16 16:49:09.858427 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.858393 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxx27\" (UniqueName: \"kubernetes.io/projected/18cb8262-52e8-4ba6-87cb-9349969eed30-kube-api-access-mxx27\") pod \"insights-runtime-extractor-p6ljq\" (UID: \"18cb8262-52e8-4ba6-87cb-9349969eed30\") " pod="openshift-insights/insights-runtime-extractor-p6ljq" Apr 16 16:49:09.858682 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.858445 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/18cb8262-52e8-4ba6-87cb-9349969eed30-crio-socket\") pod \"insights-runtime-extractor-p6ljq\" (UID: \"18cb8262-52e8-4ba6-87cb-9349969eed30\") " pod="openshift-insights/insights-runtime-extractor-p6ljq" Apr 16 16:49:09.858682 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.858469 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/0f02da32-cecf-4794-98ac-cfe18237c3e1-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-h2crd\" (UID: \"0f02da32-cecf-4794-98ac-cfe18237c3e1\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h2crd" Apr 16 16:49:09.858682 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.858494 
2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/12ccdd0e-4d6f-4501-bcff-5d8a87aa8417-registry-certificates\") pod \"image-registry-5b8c7794b5-ntls8\" (UID: \"12ccdd0e-4d6f-4501-bcff-5d8a87aa8417\") " pod="openshift-image-registry/image-registry-5b8c7794b5-ntls8" Apr 16 16:49:09.868614 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.868570 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q5jt\" (UniqueName: \"kubernetes.io/projected/20b249ed-1937-4b5b-b328-8a3db9d456fc-kube-api-access-8q5jt\") pod \"downloads-586b57c7b4-rgfgm\" (UID: \"20b249ed-1937-4b5b-b328-8a3db9d456fc\") " pod="openshift-console/downloads-586b57c7b4-rgfgm" Apr 16 16:49:09.959005 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.958931 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxx27\" (UniqueName: \"kubernetes.io/projected/18cb8262-52e8-4ba6-87cb-9349969eed30-kube-api-access-mxx27\") pod \"insights-runtime-extractor-p6ljq\" (UID: \"18cb8262-52e8-4ba6-87cb-9349969eed30\") " pod="openshift-insights/insights-runtime-extractor-p6ljq" Apr 16 16:49:09.959005 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.958976 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/18cb8262-52e8-4ba6-87cb-9349969eed30-crio-socket\") pod \"insights-runtime-extractor-p6ljq\" (UID: \"18cb8262-52e8-4ba6-87cb-9349969eed30\") " pod="openshift-insights/insights-runtime-extractor-p6ljq" Apr 16 16:49:09.959223 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.959004 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/0f02da32-cecf-4794-98ac-cfe18237c3e1-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-h2crd\" 
(UID: \"0f02da32-cecf-4794-98ac-cfe18237c3e1\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h2crd" Apr 16 16:49:09.959223 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.959029 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/12ccdd0e-4d6f-4501-bcff-5d8a87aa8417-registry-certificates\") pod \"image-registry-5b8c7794b5-ntls8\" (UID: \"12ccdd0e-4d6f-4501-bcff-5d8a87aa8417\") " pod="openshift-image-registry/image-registry-5b8c7794b5-ntls8" Apr 16 16:49:09.959223 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.959077 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/18cb8262-52e8-4ba6-87cb-9349969eed30-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-p6ljq\" (UID: \"18cb8262-52e8-4ba6-87cb-9349969eed30\") " pod="openshift-insights/insights-runtime-extractor-p6ljq" Apr 16 16:49:09.959223 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.959138 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/18cb8262-52e8-4ba6-87cb-9349969eed30-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-p6ljq\" (UID: \"18cb8262-52e8-4ba6-87cb-9349969eed30\") " pod="openshift-insights/insights-runtime-extractor-p6ljq" Apr 16 16:49:09.959223 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.959161 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/12ccdd0e-4d6f-4501-bcff-5d8a87aa8417-image-registry-private-configuration\") pod \"image-registry-5b8c7794b5-ntls8\" (UID: \"12ccdd0e-4d6f-4501-bcff-5d8a87aa8417\") " pod="openshift-image-registry/image-registry-5b8c7794b5-ntls8" Apr 16 16:49:09.959223 ip-10-0-137-126 kubenswrapper[2572]: I0416 
16:49:09.959184 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/18cb8262-52e8-4ba6-87cb-9349969eed30-crio-socket\") pod \"insights-runtime-extractor-p6ljq\" (UID: \"18cb8262-52e8-4ba6-87cb-9349969eed30\") " pod="openshift-insights/insights-runtime-extractor-p6ljq" Apr 16 16:49:09.959732 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.959186 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9pkrc\" (UniqueName: \"kubernetes.io/projected/12ccdd0e-4d6f-4501-bcff-5d8a87aa8417-kube-api-access-9pkrc\") pod \"image-registry-5b8c7794b5-ntls8\" (UID: \"12ccdd0e-4d6f-4501-bcff-5d8a87aa8417\") " pod="openshift-image-registry/image-registry-5b8c7794b5-ntls8" Apr 16 16:49:09.959815 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.959771 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/18cb8262-52e8-4ba6-87cb-9349969eed30-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-p6ljq\" (UID: \"18cb8262-52e8-4ba6-87cb-9349969eed30\") " pod="openshift-insights/insights-runtime-extractor-p6ljq" Apr 16 16:49:09.959815 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.959792 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/12ccdd0e-4d6f-4501-bcff-5d8a87aa8417-trusted-ca\") pod \"image-registry-5b8c7794b5-ntls8\" (UID: \"12ccdd0e-4d6f-4501-bcff-5d8a87aa8417\") " pod="openshift-image-registry/image-registry-5b8c7794b5-ntls8" Apr 16 16:49:09.959922 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.959862 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/18cb8262-52e8-4ba6-87cb-9349969eed30-data-volume\") pod \"insights-runtime-extractor-p6ljq\" (UID: \"18cb8262-52e8-4ba6-87cb-9349969eed30\") " 
pod="openshift-insights/insights-runtime-extractor-p6ljq" Apr 16 16:49:09.959922 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.959892 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/12ccdd0e-4d6f-4501-bcff-5d8a87aa8417-registry-tls\") pod \"image-registry-5b8c7794b5-ntls8\" (UID: \"12ccdd0e-4d6f-4501-bcff-5d8a87aa8417\") " pod="openshift-image-registry/image-registry-5b8c7794b5-ntls8" Apr 16 16:49:09.959922 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.959917 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/12ccdd0e-4d6f-4501-bcff-5d8a87aa8417-ca-trust-extracted\") pod \"image-registry-5b8c7794b5-ntls8\" (UID: \"12ccdd0e-4d6f-4501-bcff-5d8a87aa8417\") " pod="openshift-image-registry/image-registry-5b8c7794b5-ntls8" Apr 16 16:49:09.960088 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.959945 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/12ccdd0e-4d6f-4501-bcff-5d8a87aa8417-bound-sa-token\") pod \"image-registry-5b8c7794b5-ntls8\" (UID: \"12ccdd0e-4d6f-4501-bcff-5d8a87aa8417\") " pod="openshift-image-registry/image-registry-5b8c7794b5-ntls8" Apr 16 16:49:09.960088 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.959970 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/12ccdd0e-4d6f-4501-bcff-5d8a87aa8417-registry-certificates\") pod \"image-registry-5b8c7794b5-ntls8\" (UID: \"12ccdd0e-4d6f-4501-bcff-5d8a87aa8417\") " pod="openshift-image-registry/image-registry-5b8c7794b5-ntls8" Apr 16 16:49:09.960088 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.960017 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/12ccdd0e-4d6f-4501-bcff-5d8a87aa8417-installation-pull-secrets\") pod \"image-registry-5b8c7794b5-ntls8\" (UID: \"12ccdd0e-4d6f-4501-bcff-5d8a87aa8417\") " pod="openshift-image-registry/image-registry-5b8c7794b5-ntls8" Apr 16 16:49:09.961460 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.960367 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/12ccdd0e-4d6f-4501-bcff-5d8a87aa8417-ca-trust-extracted\") pod \"image-registry-5b8c7794b5-ntls8\" (UID: \"12ccdd0e-4d6f-4501-bcff-5d8a87aa8417\") " pod="openshift-image-registry/image-registry-5b8c7794b5-ntls8" Apr 16 16:49:09.961460 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.960616 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/18cb8262-52e8-4ba6-87cb-9349969eed30-data-volume\") pod \"insights-runtime-extractor-p6ljq\" (UID: \"18cb8262-52e8-4ba6-87cb-9349969eed30\") " pod="openshift-insights/insights-runtime-extractor-p6ljq" Apr 16 16:49:09.961460 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:49:09.961317 2572 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 16 16:49:09.961460 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:49:09.961389 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f02da32-cecf-4794-98ac-cfe18237c3e1-tls-certificates podName:0f02da32-cecf-4794-98ac-cfe18237c3e1 nodeName:}" failed. No retries permitted until 2026-04-16 16:49:10.461368844 +0000 UTC m=+67.630080186 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/0f02da32-cecf-4794-98ac-cfe18237c3e1-tls-certificates") pod "prometheus-operator-admission-webhook-9cb97cd87-h2crd" (UID: "0f02da32-cecf-4794-98ac-cfe18237c3e1") : secret "prometheus-operator-admission-webhook-tls" not found Apr 16 16:49:09.961460 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.961449 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/12ccdd0e-4d6f-4501-bcff-5d8a87aa8417-trusted-ca\") pod \"image-registry-5b8c7794b5-ntls8\" (UID: \"12ccdd0e-4d6f-4501-bcff-5d8a87aa8417\") " pod="openshift-image-registry/image-registry-5b8c7794b5-ntls8" Apr 16 16:49:09.962967 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.962938 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/18cb8262-52e8-4ba6-87cb-9349969eed30-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-p6ljq\" (UID: \"18cb8262-52e8-4ba6-87cb-9349969eed30\") " pod="openshift-insights/insights-runtime-extractor-p6ljq" Apr 16 16:49:09.963121 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.963099 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/12ccdd0e-4d6f-4501-bcff-5d8a87aa8417-image-registry-private-configuration\") pod \"image-registry-5b8c7794b5-ntls8\" (UID: \"12ccdd0e-4d6f-4501-bcff-5d8a87aa8417\") " pod="openshift-image-registry/image-registry-5b8c7794b5-ntls8" Apr 16 16:49:09.964725 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.964698 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/12ccdd0e-4d6f-4501-bcff-5d8a87aa8417-installation-pull-secrets\") pod \"image-registry-5b8c7794b5-ntls8\" (UID: 
\"12ccdd0e-4d6f-4501-bcff-5d8a87aa8417\") " pod="openshift-image-registry/image-registry-5b8c7794b5-ntls8" Apr 16 16:49:09.965102 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.965058 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/12ccdd0e-4d6f-4501-bcff-5d8a87aa8417-registry-tls\") pod \"image-registry-5b8c7794b5-ntls8\" (UID: \"12ccdd0e-4d6f-4501-bcff-5d8a87aa8417\") " pod="openshift-image-registry/image-registry-5b8c7794b5-ntls8" Apr 16 16:49:09.973368 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.973328 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pkrc\" (UniqueName: \"kubernetes.io/projected/12ccdd0e-4d6f-4501-bcff-5d8a87aa8417-kube-api-access-9pkrc\") pod \"image-registry-5b8c7794b5-ntls8\" (UID: \"12ccdd0e-4d6f-4501-bcff-5d8a87aa8417\") " pod="openshift-image-registry/image-registry-5b8c7794b5-ntls8" Apr 16 16:49:09.987724 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.987682 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/12ccdd0e-4d6f-4501-bcff-5d8a87aa8417-bound-sa-token\") pod \"image-registry-5b8c7794b5-ntls8\" (UID: \"12ccdd0e-4d6f-4501-bcff-5d8a87aa8417\") " pod="openshift-image-registry/image-registry-5b8c7794b5-ntls8" Apr 16 16:49:09.994497 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.994473 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-586b57c7b4-rgfgm" Apr 16 16:49:09.999140 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:09.999122 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxx27\" (UniqueName: \"kubernetes.io/projected/18cb8262-52e8-4ba6-87cb-9349969eed30-kube-api-access-mxx27\") pod \"insights-runtime-extractor-p6ljq\" (UID: \"18cb8262-52e8-4ba6-87cb-9349969eed30\") " pod="openshift-insights/insights-runtime-extractor-p6ljq" Apr 16 16:49:10.089763 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:10.089739 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-p6ljq" Apr 16 16:49:10.102832 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:10.102650 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5b8c7794b5-ntls8" Apr 16 16:49:10.465053 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:10.465016 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/0f02da32-cecf-4794-98ac-cfe18237c3e1-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-h2crd\" (UID: \"0f02da32-cecf-4794-98ac-cfe18237c3e1\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h2crd" Apr 16 16:49:10.465255 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:49:10.465179 2572 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 16 16:49:10.465320 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:49:10.465254 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f02da32-cecf-4794-98ac-cfe18237c3e1-tls-certificates podName:0f02da32-cecf-4794-98ac-cfe18237c3e1 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:49:11.465234957 +0000 UTC m=+68.633946311 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/0f02da32-cecf-4794-98ac-cfe18237c3e1-tls-certificates") pod "prometheus-operator-admission-webhook-9cb97cd87-h2crd" (UID: "0f02da32-cecf-4794-98ac-cfe18237c3e1") : secret "prometheus-operator-admission-webhook-tls" not found Apr 16 16:49:10.759003 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:10.758944 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-rgfgm"] Apr 16 16:49:10.781289 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:49:10.781259 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20b249ed_1937_4b5b_b328_8a3db9d456fc.slice/crio-f4eccd8fdeabd43da1fcc413a6008e39b9385814d42b9cbf1d2836972701823a WatchSource:0}: Error finding container f4eccd8fdeabd43da1fcc413a6008e39b9385814d42b9cbf1d2836972701823a: Status 404 returned error can't find the container with id f4eccd8fdeabd43da1fcc413a6008e39b9385814d42b9cbf1d2836972701823a Apr 16 16:49:10.790373 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:10.790345 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5b8c7794b5-ntls8"] Apr 16 16:49:10.803053 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:49:10.803017 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12ccdd0e_4d6f_4501_bcff_5d8a87aa8417.slice/crio-2347b2148b39ed35f49c442cb83a660b10a74c75c62499507f79e1e48b52c662 WatchSource:0}: Error finding container 2347b2148b39ed35f49c442cb83a660b10a74c75c62499507f79e1e48b52c662: Status 404 returned error can't find the container with id 2347b2148b39ed35f49c442cb83a660b10a74c75c62499507f79e1e48b52c662 Apr 16 16:49:10.904122 ip-10-0-137-126 kubenswrapper[2572]: 
I0416 16:49:10.903378 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bdzw7" event={"ID":"535dfd9c-5e07-4e18-886d-57be1138629f","Type":"ContainerStarted","Data":"bf0f19fa33181534f28d651dde6394ede5d6c1c2fce1c29cff57e6da03943919"} Apr 16 16:49:10.915079 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:10.912787 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-k7p77" event={"ID":"74b52111-9c5e-4b37-ab68-e34630312fcb","Type":"ContainerStarted","Data":"28c2f85b16ece25b3803806c58d955121e00486fc51c06d38ffbcbff419838a1"} Apr 16 16:49:10.920609 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:10.919890 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-bdzw7" podStartSLOduration=33.325193466 podStartE2EDuration="35.919874177s" podCreationTimestamp="2026-04-16 16:48:35 +0000 UTC" firstStartedPulling="2026-04-16 16:49:07.987690121 +0000 UTC m=+65.156401462" lastFinishedPulling="2026-04-16 16:49:10.58237083 +0000 UTC m=+67.751082173" observedRunningTime="2026-04-16 16:49:10.919189972 +0000 UTC m=+68.087901556" watchObservedRunningTime="2026-04-16 16:49:10.919874177 +0000 UTC m=+68.088585545" Apr 16 16:49:10.920609 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:10.920003 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dv6f9" event={"ID":"890f4655-f936-4bb9-b82c-524efb501585","Type":"ContainerStarted","Data":"c9465567a5057f0e8041eb72a66f69b5d07d326a96d5aadce0555864e22f2135"} Apr 16 16:49:10.925051 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:10.925026 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5b8c7794b5-ntls8" event={"ID":"12ccdd0e-4d6f-4501-bcff-5d8a87aa8417","Type":"ContainerStarted","Data":"9dc902965b81c167066fe660fc771ea3a74e445283ef1852a0bb54a0b4e6f88b"} Apr 16 16:49:10.925150 ip-10-0-137-126 
kubenswrapper[2572]: I0416 16:49:10.925080 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5b8c7794b5-ntls8" event={"ID":"12ccdd0e-4d6f-4501-bcff-5d8a87aa8417","Type":"ContainerStarted","Data":"2347b2148b39ed35f49c442cb83a660b10a74c75c62499507f79e1e48b52c662"} Apr 16 16:49:10.925755 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:10.925735 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5b8c7794b5-ntls8" Apr 16 16:49:10.929660 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:10.929454 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-rgfgm" event={"ID":"20b249ed-1937-4b5b-b328-8a3db9d456fc","Type":"ContainerStarted","Data":"f4eccd8fdeabd43da1fcc413a6008e39b9385814d42b9cbf1d2836972701823a"} Apr 16 16:49:10.946007 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:10.945762 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5b8c7794b5-ntls8" podStartSLOduration=1.9457456340000001 podStartE2EDuration="1.945745634s" podCreationTimestamp="2026-04-16 16:49:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:49:10.944298792 +0000 UTC m=+68.113010156" watchObservedRunningTime="2026-04-16 16:49:10.945745634 +0000 UTC m=+68.114456998" Apr 16 16:49:11.015708 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:11.015675 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-p6ljq"] Apr 16 16:49:11.477079 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:11.477024 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/0f02da32-cecf-4794-98ac-cfe18237c3e1-tls-certificates\") pod 
\"prometheus-operator-admission-webhook-9cb97cd87-h2crd\" (UID: \"0f02da32-cecf-4794-98ac-cfe18237c3e1\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h2crd" Apr 16 16:49:11.479435 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:11.479411 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/0f02da32-cecf-4794-98ac-cfe18237c3e1-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-h2crd\" (UID: \"0f02da32-cecf-4794-98ac-cfe18237c3e1\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h2crd" Apr 16 16:49:11.613368 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:11.613334 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h2crd" Apr 16 16:49:11.933212 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:11.933176 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-k7p77" event={"ID":"74b52111-9c5e-4b37-ab68-e34630312fcb","Type":"ContainerStarted","Data":"4ff6e0b3d07e7ae51c5121561e5a2252affced8a5b89a10e80df8d1443d81751"} Apr 16 16:49:11.933631 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:11.933327 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-k7p77" Apr 16 16:49:11.934633 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:11.934604 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dv6f9" event={"ID":"890f4655-f936-4bb9-b82c-524efb501585","Type":"ContainerStarted","Data":"88af3e3004d35f5c937f0e1d74099a00884ef0db8a8922feba024bc6b19c6cce"} Apr 16 16:49:11.943445 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:49:11.943422 2572 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18cb8262_52e8_4ba6_87cb_9349969eed30.slice/crio-8c01b47a626ec63ca754cadd7e824e26aba94d5b266017ff4f8a776f718563fa WatchSource:0}: Error finding container 8c01b47a626ec63ca754cadd7e824e26aba94d5b266017ff4f8a776f718563fa: Status 404 returned error can't find the container with id 8c01b47a626ec63ca754cadd7e824e26aba94d5b266017ff4f8a776f718563fa Apr 16 16:49:11.951388 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:11.951338 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-k7p77" podStartSLOduration=34.331446937 podStartE2EDuration="36.951323159s" podCreationTimestamp="2026-04-16 16:48:35 +0000 UTC" firstStartedPulling="2026-04-16 16:49:07.959937589 +0000 UTC m=+65.128648931" lastFinishedPulling="2026-04-16 16:49:10.579813797 +0000 UTC m=+67.748525153" observedRunningTime="2026-04-16 16:49:11.95048517 +0000 UTC m=+69.119196534" watchObservedRunningTime="2026-04-16 16:49:11.951323159 +0000 UTC m=+69.120034523" Apr 16 16:49:11.966970 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:11.966921 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-dv6f9" podStartSLOduration=67.027476169 podStartE2EDuration="1m8.96690378s" podCreationTimestamp="2026-04-16 16:48:03 +0000 UTC" firstStartedPulling="2026-04-16 16:49:08.680672671 +0000 UTC m=+65.849384017" lastFinishedPulling="2026-04-16 16:49:10.620100281 +0000 UTC m=+67.788811628" observedRunningTime="2026-04-16 16:49:11.966730539 +0000 UTC m=+69.135441903" watchObservedRunningTime="2026-04-16 16:49:11.96690378 +0000 UTC m=+69.135615143" Apr 16 16:49:12.076719 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:12.076693 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h2crd"] Apr 16 16:49:12.079337 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:49:12.079313 2572 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f02da32_cecf_4794_98ac_cfe18237c3e1.slice/crio-d48f9d8dbd435cf17d2fd2f359166936998f312b53c120840e29aa1e545837f2 WatchSource:0}: Error finding container d48f9d8dbd435cf17d2fd2f359166936998f312b53c120840e29aa1e545837f2: Status 404 returned error can't find the container with id d48f9d8dbd435cf17d2fd2f359166936998f312b53c120840e29aa1e545837f2 Apr 16 16:49:12.938993 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:12.938959 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h2crd" event={"ID":"0f02da32-cecf-4794-98ac-cfe18237c3e1","Type":"ContainerStarted","Data":"d48f9d8dbd435cf17d2fd2f359166936998f312b53c120840e29aa1e545837f2"} Apr 16 16:49:12.940544 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:12.940515 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-s7xbf" event={"ID":"feb3fcea-2282-411d-bb57-2562cc290f0a","Type":"ContainerStarted","Data":"fd3c3d978fd3361c123bd22392272f23e65d642e316548bd10b88911e9a19d13"} Apr 16 16:49:12.940699 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:12.940672 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-s7xbf" Apr 16 16:49:12.942222 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:12.942193 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-p6ljq" event={"ID":"18cb8262-52e8-4ba6-87cb-9349969eed30","Type":"ContainerStarted","Data":"83cac77042d253331226a4827357e27f2f79726ee3b839c4e12d5a975c65db48"} Apr 16 16:49:12.942325 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:12.942230 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-p6ljq" 
event={"ID":"18cb8262-52e8-4ba6-87cb-9349969eed30","Type":"ContainerStarted","Data":"9ef3e6b7e35729ee23a462166552097a18c5863a39389aeb8c4111b1ec53da89"} Apr 16 16:49:12.942325 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:12.942244 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-p6ljq" event={"ID":"18cb8262-52e8-4ba6-87cb-9349969eed30","Type":"ContainerStarted","Data":"8c01b47a626ec63ca754cadd7e824e26aba94d5b266017ff4f8a776f718563fa"} Apr 16 16:49:12.957292 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:12.957248 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-s7xbf" podStartSLOduration=66.599754335 podStartE2EDuration="1m9.957232327s" podCreationTimestamp="2026-04-16 16:48:03 +0000 UTC" firstStartedPulling="2026-04-16 16:49:08.669128063 +0000 UTC m=+65.837839416" lastFinishedPulling="2026-04-16 16:49:12.026606067 +0000 UTC m=+69.195317408" observedRunningTime="2026-04-16 16:49:12.956359704 +0000 UTC m=+70.125071067" watchObservedRunningTime="2026-04-16 16:49:12.957232327 +0000 UTC m=+70.125943691" Apr 16 16:49:13.947149 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:13.947087 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h2crd" event={"ID":"0f02da32-cecf-4794-98ac-cfe18237c3e1","Type":"ContainerStarted","Data":"77069e50e1ac6995f9fa45f96558c3428b1b0cfb8c72fa30a8b5673459d53848"} Apr 16 16:49:13.965687 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:13.965635 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h2crd" podStartSLOduration=3.902383946 podStartE2EDuration="4.965621993s" podCreationTimestamp="2026-04-16 16:49:09 +0000 UTC" firstStartedPulling="2026-04-16 16:49:12.081021896 +0000 UTC m=+69.249733237" lastFinishedPulling="2026-04-16 
16:49:13.144259926 +0000 UTC m=+70.312971284" observedRunningTime="2026-04-16 16:49:13.964338245 +0000 UTC m=+71.133049609" watchObservedRunningTime="2026-04-16 16:49:13.965621993 +0000 UTC m=+71.134333355" Apr 16 16:49:14.951743 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:14.951704 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-p6ljq" event={"ID":"18cb8262-52e8-4ba6-87cb-9349969eed30","Type":"ContainerStarted","Data":"e82e460fee96148ecb173fab4ee68768b75704ca37ed4fb199e126b43f20354a"} Apr 16 16:49:14.952191 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:14.951966 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h2crd" Apr 16 16:49:14.957239 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:14.957218 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h2crd" Apr 16 16:49:14.971724 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:14.971690 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-p6ljq" podStartSLOduration=3.821105222 podStartE2EDuration="5.971679949s" podCreationTimestamp="2026-04-16 16:49:09 +0000 UTC" firstStartedPulling="2026-04-16 16:49:12.022435044 +0000 UTC m=+69.191146389" lastFinishedPulling="2026-04-16 16:49:14.173009771 +0000 UTC m=+71.341721116" observedRunningTime="2026-04-16 16:49:14.970329862 +0000 UTC m=+72.139041210" watchObservedRunningTime="2026-04-16 16:49:14.971679949 +0000 UTC m=+72.140391312" Apr 16 16:49:15.594210 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.594170 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-576b986fc4-xfn7p"] Apr 16 16:49:15.606997 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.606970 2572 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-console/console-576b986fc4-xfn7p"] Apr 16 16:49:15.607176 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.607106 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-576b986fc4-xfn7p" Apr 16 16:49:15.609917 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.609894 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 16:49:15.609917 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.609905 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 16:49:15.610115 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.609978 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 16:49:15.610115 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.609894 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 16:49:15.611382 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.611333 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 16:49:15.611581 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.611566 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-sdzgj\"" Apr 16 16:49:15.642135 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.642107 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-7t7n4"] Apr 16 16:49:15.654570 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.654545 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-7t7n4"] Apr 16 16:49:15.654693 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.654680 
2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-7t7n4" Apr 16 16:49:15.657426 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.657407 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 16:49:15.657551 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.657536 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 16:49:15.657614 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.657539 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 16 16:49:15.657828 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.657809 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 16:49:15.657911 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.657884 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-gw4w9\"" Apr 16 16:49:15.657967 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.657923 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 16 16:49:15.714427 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.714405 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxjj8\" (UniqueName: \"kubernetes.io/projected/4bb1ed34-b734-410e-b2ba-452f8604f2ef-kube-api-access-rxjj8\") pod \"console-576b986fc4-xfn7p\" (UID: \"4bb1ed34-b734-410e-b2ba-452f8604f2ef\") " pod="openshift-console/console-576b986fc4-xfn7p" Apr 16 16:49:15.714547 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.714445 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4bb1ed34-b734-410e-b2ba-452f8604f2ef-oauth-serving-cert\") pod \"console-576b986fc4-xfn7p\" (UID: \"4bb1ed34-b734-410e-b2ba-452f8604f2ef\") " pod="openshift-console/console-576b986fc4-xfn7p" Apr 16 16:49:15.714547 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.714473 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4bb1ed34-b734-410e-b2ba-452f8604f2ef-console-serving-cert\") pod \"console-576b986fc4-xfn7p\" (UID: \"4bb1ed34-b734-410e-b2ba-452f8604f2ef\") " pod="openshift-console/console-576b986fc4-xfn7p" Apr 16 16:49:15.714645 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.714554 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86sjk\" (UniqueName: \"kubernetes.io/projected/ac362f0d-aa0b-4b45-bcdd-4549f444fd35-kube-api-access-86sjk\") pod \"prometheus-operator-78f957474d-7t7n4\" (UID: \"ac362f0d-aa0b-4b45-bcdd-4549f444fd35\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7t7n4" Apr 16 16:49:15.714645 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.714613 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4bb1ed34-b734-410e-b2ba-452f8604f2ef-service-ca\") pod \"console-576b986fc4-xfn7p\" (UID: \"4bb1ed34-b734-410e-b2ba-452f8604f2ef\") " pod="openshift-console/console-576b986fc4-xfn7p" Apr 16 16:49:15.714645 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.714637 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ac362f0d-aa0b-4b45-bcdd-4549f444fd35-metrics-client-ca\") pod 
\"prometheus-operator-78f957474d-7t7n4\" (UID: \"ac362f0d-aa0b-4b45-bcdd-4549f444fd35\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7t7n4" Apr 16 16:49:15.714786 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.714711 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ac362f0d-aa0b-4b45-bcdd-4549f444fd35-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-7t7n4\" (UID: \"ac362f0d-aa0b-4b45-bcdd-4549f444fd35\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7t7n4" Apr 16 16:49:15.714786 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.714749 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4bb1ed34-b734-410e-b2ba-452f8604f2ef-console-oauth-config\") pod \"console-576b986fc4-xfn7p\" (UID: \"4bb1ed34-b734-410e-b2ba-452f8604f2ef\") " pod="openshift-console/console-576b986fc4-xfn7p" Apr 16 16:49:15.714786 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.714776 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4bb1ed34-b734-410e-b2ba-452f8604f2ef-console-config\") pod \"console-576b986fc4-xfn7p\" (UID: \"4bb1ed34-b734-410e-b2ba-452f8604f2ef\") " pod="openshift-console/console-576b986fc4-xfn7p" Apr 16 16:49:15.718782 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.714802 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac362f0d-aa0b-4b45-bcdd-4549f444fd35-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-7t7n4\" (UID: \"ac362f0d-aa0b-4b45-bcdd-4549f444fd35\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7t7n4" Apr 
16 16:49:15.815691 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.815660 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4bb1ed34-b734-410e-b2ba-452f8604f2ef-service-ca\") pod \"console-576b986fc4-xfn7p\" (UID: \"4bb1ed34-b734-410e-b2ba-452f8604f2ef\") " pod="openshift-console/console-576b986fc4-xfn7p" Apr 16 16:49:15.815807 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.815703 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ac362f0d-aa0b-4b45-bcdd-4549f444fd35-metrics-client-ca\") pod \"prometheus-operator-78f957474d-7t7n4\" (UID: \"ac362f0d-aa0b-4b45-bcdd-4549f444fd35\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7t7n4" Apr 16 16:49:15.815807 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.815743 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ac362f0d-aa0b-4b45-bcdd-4549f444fd35-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-7t7n4\" (UID: \"ac362f0d-aa0b-4b45-bcdd-4549f444fd35\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7t7n4" Apr 16 16:49:15.815807 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.815773 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4bb1ed34-b734-410e-b2ba-452f8604f2ef-console-oauth-config\") pod \"console-576b986fc4-xfn7p\" (UID: \"4bb1ed34-b734-410e-b2ba-452f8604f2ef\") " pod="openshift-console/console-576b986fc4-xfn7p" Apr 16 16:49:15.815807 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.815796 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/4bb1ed34-b734-410e-b2ba-452f8604f2ef-console-config\") pod \"console-576b986fc4-xfn7p\" (UID: \"4bb1ed34-b734-410e-b2ba-452f8604f2ef\") " pod="openshift-console/console-576b986fc4-xfn7p" Apr 16 16:49:15.816041 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.815819 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac362f0d-aa0b-4b45-bcdd-4549f444fd35-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-7t7n4\" (UID: \"ac362f0d-aa0b-4b45-bcdd-4549f444fd35\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7t7n4" Apr 16 16:49:15.816041 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.815846 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxjj8\" (UniqueName: \"kubernetes.io/projected/4bb1ed34-b734-410e-b2ba-452f8604f2ef-kube-api-access-rxjj8\") pod \"console-576b986fc4-xfn7p\" (UID: \"4bb1ed34-b734-410e-b2ba-452f8604f2ef\") " pod="openshift-console/console-576b986fc4-xfn7p" Apr 16 16:49:15.816041 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.815874 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4bb1ed34-b734-410e-b2ba-452f8604f2ef-oauth-serving-cert\") pod \"console-576b986fc4-xfn7p\" (UID: \"4bb1ed34-b734-410e-b2ba-452f8604f2ef\") " pod="openshift-console/console-576b986fc4-xfn7p" Apr 16 16:49:15.816041 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.815900 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4bb1ed34-b734-410e-b2ba-452f8604f2ef-console-serving-cert\") pod \"console-576b986fc4-xfn7p\" (UID: \"4bb1ed34-b734-410e-b2ba-452f8604f2ef\") " pod="openshift-console/console-576b986fc4-xfn7p" Apr 16 16:49:15.816041 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.815938 
2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-86sjk\" (UniqueName: \"kubernetes.io/projected/ac362f0d-aa0b-4b45-bcdd-4549f444fd35-kube-api-access-86sjk\") pod \"prometheus-operator-78f957474d-7t7n4\" (UID: \"ac362f0d-aa0b-4b45-bcdd-4549f444fd35\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7t7n4" Apr 16 16:49:15.816521 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.816496 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ac362f0d-aa0b-4b45-bcdd-4549f444fd35-metrics-client-ca\") pod \"prometheus-operator-78f957474d-7t7n4\" (UID: \"ac362f0d-aa0b-4b45-bcdd-4549f444fd35\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7t7n4" Apr 16 16:49:15.816597 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.816508 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4bb1ed34-b734-410e-b2ba-452f8604f2ef-service-ca\") pod \"console-576b986fc4-xfn7p\" (UID: \"4bb1ed34-b734-410e-b2ba-452f8604f2ef\") " pod="openshift-console/console-576b986fc4-xfn7p" Apr 16 16:49:15.816756 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.816708 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4bb1ed34-b734-410e-b2ba-452f8604f2ef-console-config\") pod \"console-576b986fc4-xfn7p\" (UID: \"4bb1ed34-b734-410e-b2ba-452f8604f2ef\") " pod="openshift-console/console-576b986fc4-xfn7p" Apr 16 16:49:15.817036 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.817017 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4bb1ed34-b734-410e-b2ba-452f8604f2ef-oauth-serving-cert\") pod \"console-576b986fc4-xfn7p\" (UID: \"4bb1ed34-b734-410e-b2ba-452f8604f2ef\") " pod="openshift-console/console-576b986fc4-xfn7p" 
Apr 16 16:49:15.818686 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.818665 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4bb1ed34-b734-410e-b2ba-452f8604f2ef-console-oauth-config\") pod \"console-576b986fc4-xfn7p\" (UID: \"4bb1ed34-b734-410e-b2ba-452f8604f2ef\") " pod="openshift-console/console-576b986fc4-xfn7p" Apr 16 16:49:15.818778 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.818734 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac362f0d-aa0b-4b45-bcdd-4549f444fd35-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-7t7n4\" (UID: \"ac362f0d-aa0b-4b45-bcdd-4549f444fd35\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7t7n4" Apr 16 16:49:15.818983 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.818962 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ac362f0d-aa0b-4b45-bcdd-4549f444fd35-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-7t7n4\" (UID: \"ac362f0d-aa0b-4b45-bcdd-4549f444fd35\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7t7n4" Apr 16 16:49:15.819122 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.819100 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4bb1ed34-b734-410e-b2ba-452f8604f2ef-console-serving-cert\") pod \"console-576b986fc4-xfn7p\" (UID: \"4bb1ed34-b734-410e-b2ba-452f8604f2ef\") " pod="openshift-console/console-576b986fc4-xfn7p" Apr 16 16:49:15.824450 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.824430 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-86sjk\" (UniqueName: 
\"kubernetes.io/projected/ac362f0d-aa0b-4b45-bcdd-4549f444fd35-kube-api-access-86sjk\") pod \"prometheus-operator-78f957474d-7t7n4\" (UID: \"ac362f0d-aa0b-4b45-bcdd-4549f444fd35\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7t7n4"
Apr 16 16:49:15.825006 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.824983 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxjj8\" (UniqueName: \"kubernetes.io/projected/4bb1ed34-b734-410e-b2ba-452f8604f2ef-kube-api-access-rxjj8\") pod \"console-576b986fc4-xfn7p\" (UID: \"4bb1ed34-b734-410e-b2ba-452f8604f2ef\") " pod="openshift-console/console-576b986fc4-xfn7p"
Apr 16 16:49:15.924772 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.924748 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-576b986fc4-xfn7p"
Apr 16 16:49:15.965519 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:15.965489 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-7t7n4"
Apr 16 16:49:16.066570 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:16.066516 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-576b986fc4-xfn7p"]
Apr 16 16:49:16.070192 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:49:16.070155 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bb1ed34_b734_410e_b2ba_452f8604f2ef.slice/crio-bf96e6529b55cb0d8b1463ef119fd2edbe24f10658258c13dd5bf5fc6385a784 WatchSource:0}: Error finding container bf96e6529b55cb0d8b1463ef119fd2edbe24f10658258c13dd5bf5fc6385a784: Status 404 returned error can't find the container with id bf96e6529b55cb0d8b1463ef119fd2edbe24f10658258c13dd5bf5fc6385a784
Apr 16 16:49:16.109048 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:16.108990 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-7t7n4"]
Apr 16 16:49:16.111482 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:49:16.111454 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac362f0d_aa0b_4b45_bcdd_4549f444fd35.slice/crio-cc22d1d8af8fa921748f16f0d424074fd851118303058b7644b68366ce995405 WatchSource:0}: Error finding container cc22d1d8af8fa921748f16f0d424074fd851118303058b7644b68366ce995405: Status 404 returned error can't find the container with id cc22d1d8af8fa921748f16f0d424074fd851118303058b7644b68366ce995405
Apr 16 16:49:16.960543 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:16.960093 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-576b986fc4-xfn7p" event={"ID":"4bb1ed34-b734-410e-b2ba-452f8604f2ef","Type":"ContainerStarted","Data":"bf96e6529b55cb0d8b1463ef119fd2edbe24f10658258c13dd5bf5fc6385a784"}
Apr 16 16:49:16.962850 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:16.962789 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-7t7n4" event={"ID":"ac362f0d-aa0b-4b45-bcdd-4549f444fd35","Type":"ContainerStarted","Data":"cc22d1d8af8fa921748f16f0d424074fd851118303058b7644b68366ce995405"}
Apr 16 16:49:17.968365 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:17.968326 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-7t7n4" event={"ID":"ac362f0d-aa0b-4b45-bcdd-4549f444fd35","Type":"ContainerStarted","Data":"ec1219043511e89181578946ecd1f4e4a723aba5f607a9779fde81728ae4fddd"}
Apr 16 16:49:17.968365 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:17.968368 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-7t7n4" event={"ID":"ac362f0d-aa0b-4b45-bcdd-4549f444fd35","Type":"ContainerStarted","Data":"42bb8cd43a0e1d5a0bceda7abf4e9b3cfb1f53b36bc3e077dc3d859e57da72bc"}
Apr 16 16:49:17.993493 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:17.993351 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-7t7n4" podStartSLOduration=1.604110734 podStartE2EDuration="2.993336395s" podCreationTimestamp="2026-04-16 16:49:15 +0000 UTC" firstStartedPulling="2026-04-16 16:49:16.113627754 +0000 UTC m=+73.282339095" lastFinishedPulling="2026-04-16 16:49:17.502853411 +0000 UTC m=+74.671564756" observedRunningTime="2026-04-16 16:49:17.992222763 +0000 UTC m=+75.160934128" watchObservedRunningTime="2026-04-16 16:49:17.993336395 +0000 UTC m=+75.162047757"
Apr 16 16:49:19.976254 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:19.976204 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-576b986fc4-xfn7p" event={"ID":"4bb1ed34-b734-410e-b2ba-452f8604f2ef","Type":"ContainerStarted","Data":"c5f9647de732088c292f2210842e87b8848daab98ae645811647201f2504465c"}
Apr 16 16:49:19.997150 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:19.997091 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-576b986fc4-xfn7p" podStartSLOduration=1.977462957 podStartE2EDuration="4.997074931s" podCreationTimestamp="2026-04-16 16:49:15 +0000 UTC" firstStartedPulling="2026-04-16 16:49:16.072742399 +0000 UTC m=+73.241453749" lastFinishedPulling="2026-04-16 16:49:19.092354381 +0000 UTC m=+76.261065723" observedRunningTime="2026-04-16 16:49:19.994968211 +0000 UTC m=+77.163679575" watchObservedRunningTime="2026-04-16 16:49:19.997074931 +0000 UTC m=+77.165786289"
Apr 16 16:49:20.027476 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.027433 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-dw57d"]
Apr 16 16:49:20.030006 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.029974 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-dw57d"
Apr 16 16:49:20.035306 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.035283 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 16:49:20.035431 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.035325 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 16:49:20.035431 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.035289 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 16:49:20.035558 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.035453 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-lxqh8\""
Apr 16 16:49:20.150706 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.150314 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a2eb8640-dd12-4e24-a36c-be01ef52908a-node-exporter-accelerators-collector-config\") pod \"node-exporter-dw57d\" (UID: \"a2eb8640-dd12-4e24-a36c-be01ef52908a\") " pod="openshift-monitoring/node-exporter-dw57d"
Apr 16 16:49:20.150706 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.150368 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a2eb8640-dd12-4e24-a36c-be01ef52908a-node-exporter-textfile\") pod \"node-exporter-dw57d\" (UID: \"a2eb8640-dd12-4e24-a36c-be01ef52908a\") " pod="openshift-monitoring/node-exporter-dw57d"
Apr 16 16:49:20.150706 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.150396 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a2eb8640-dd12-4e24-a36c-be01ef52908a-root\") pod \"node-exporter-dw57d\" (UID: \"a2eb8640-dd12-4e24-a36c-be01ef52908a\") " pod="openshift-monitoring/node-exporter-dw57d"
Apr 16 16:49:20.150706 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.150427 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a2eb8640-dd12-4e24-a36c-be01ef52908a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dw57d\" (UID: \"a2eb8640-dd12-4e24-a36c-be01ef52908a\") " pod="openshift-monitoring/node-exporter-dw57d"
Apr 16 16:49:20.150706 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.150460 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a2eb8640-dd12-4e24-a36c-be01ef52908a-node-exporter-wtmp\") pod \"node-exporter-dw57d\" (UID: \"a2eb8640-dd12-4e24-a36c-be01ef52908a\") " pod="openshift-monitoring/node-exporter-dw57d"
Apr 16 16:49:20.150706 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.150486 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a2eb8640-dd12-4e24-a36c-be01ef52908a-metrics-client-ca\") pod \"node-exporter-dw57d\" (UID: \"a2eb8640-dd12-4e24-a36c-be01ef52908a\") " pod="openshift-monitoring/node-exporter-dw57d"
Apr 16 16:49:20.150706 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.150515 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a2eb8640-dd12-4e24-a36c-be01ef52908a-sys\") pod \"node-exporter-dw57d\" (UID: \"a2eb8640-dd12-4e24-a36c-be01ef52908a\") " pod="openshift-monitoring/node-exporter-dw57d"
Apr 16 16:49:20.150706 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.150537 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a2eb8640-dd12-4e24-a36c-be01ef52908a-node-exporter-tls\") pod \"node-exporter-dw57d\" (UID: \"a2eb8640-dd12-4e24-a36c-be01ef52908a\") " pod="openshift-monitoring/node-exporter-dw57d"
Apr 16 16:49:20.150706 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.150576 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btfqt\" (UniqueName: \"kubernetes.io/projected/a2eb8640-dd12-4e24-a36c-be01ef52908a-kube-api-access-btfqt\") pod \"node-exporter-dw57d\" (UID: \"a2eb8640-dd12-4e24-a36c-be01ef52908a\") " pod="openshift-monitoring/node-exporter-dw57d"
Apr 16 16:49:20.251738 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.251659 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a2eb8640-dd12-4e24-a36c-be01ef52908a-node-exporter-accelerators-collector-config\") pod \"node-exporter-dw57d\" (UID: \"a2eb8640-dd12-4e24-a36c-be01ef52908a\") " pod="openshift-monitoring/node-exporter-dw57d"
Apr 16 16:49:20.251738 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.251706 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a2eb8640-dd12-4e24-a36c-be01ef52908a-node-exporter-textfile\") pod \"node-exporter-dw57d\" (UID: \"a2eb8640-dd12-4e24-a36c-be01ef52908a\") " pod="openshift-monitoring/node-exporter-dw57d"
Apr 16 16:49:20.251934 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.251844 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a2eb8640-dd12-4e24-a36c-be01ef52908a-root\") pod \"node-exporter-dw57d\" (UID: \"a2eb8640-dd12-4e24-a36c-be01ef52908a\") " pod="openshift-monitoring/node-exporter-dw57d"
Apr 16 16:49:20.251934 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.251895 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a2eb8640-dd12-4e24-a36c-be01ef52908a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dw57d\" (UID: \"a2eb8640-dd12-4e24-a36c-be01ef52908a\") " pod="openshift-monitoring/node-exporter-dw57d"
Apr 16 16:49:20.252037 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.251936 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a2eb8640-dd12-4e24-a36c-be01ef52908a-node-exporter-wtmp\") pod \"node-exporter-dw57d\" (UID: \"a2eb8640-dd12-4e24-a36c-be01ef52908a\") " pod="openshift-monitoring/node-exporter-dw57d"
Apr 16 16:49:20.252037 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.251962 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a2eb8640-dd12-4e24-a36c-be01ef52908a-metrics-client-ca\") pod \"node-exporter-dw57d\" (UID: \"a2eb8640-dd12-4e24-a36c-be01ef52908a\") " pod="openshift-monitoring/node-exporter-dw57d"
Apr 16 16:49:20.252037 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.252000 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a2eb8640-dd12-4e24-a36c-be01ef52908a-sys\") pod \"node-exporter-dw57d\" (UID: \"a2eb8640-dd12-4e24-a36c-be01ef52908a\") " pod="openshift-monitoring/node-exporter-dw57d"
Apr 16 16:49:20.252037 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.252030 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a2eb8640-dd12-4e24-a36c-be01ef52908a-node-exporter-tls\") pod \"node-exporter-dw57d\" (UID: \"a2eb8640-dd12-4e24-a36c-be01ef52908a\") " pod="openshift-monitoring/node-exporter-dw57d"
Apr 16 16:49:20.252263 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.252037 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a2eb8640-dd12-4e24-a36c-be01ef52908a-node-exporter-textfile\") pod \"node-exporter-dw57d\" (UID: \"a2eb8640-dd12-4e24-a36c-be01ef52908a\") " pod="openshift-monitoring/node-exporter-dw57d"
Apr 16 16:49:20.252263 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.252090 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-btfqt\" (UniqueName: \"kubernetes.io/projected/a2eb8640-dd12-4e24-a36c-be01ef52908a-kube-api-access-btfqt\") pod \"node-exporter-dw57d\" (UID: \"a2eb8640-dd12-4e24-a36c-be01ef52908a\") " pod="openshift-monitoring/node-exporter-dw57d"
Apr 16 16:49:20.252263 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.252194 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a2eb8640-dd12-4e24-a36c-be01ef52908a-node-exporter-wtmp\") pod \"node-exporter-dw57d\" (UID: \"a2eb8640-dd12-4e24-a36c-be01ef52908a\") " pod="openshift-monitoring/node-exporter-dw57d"
Apr 16 16:49:20.252263 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.252235 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a2eb8640-dd12-4e24-a36c-be01ef52908a-root\") pod \"node-exporter-dw57d\" (UID: \"a2eb8640-dd12-4e24-a36c-be01ef52908a\") " pod="openshift-monitoring/node-exporter-dw57d"
Apr 16 16:49:20.252450 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.252359 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a2eb8640-dd12-4e24-a36c-be01ef52908a-node-exporter-accelerators-collector-config\") pod \"node-exporter-dw57d\" (UID: \"a2eb8640-dd12-4e24-a36c-be01ef52908a\") " pod="openshift-monitoring/node-exporter-dw57d"
Apr 16 16:49:20.252450 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.252428 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a2eb8640-dd12-4e24-a36c-be01ef52908a-sys\") pod \"node-exporter-dw57d\" (UID: \"a2eb8640-dd12-4e24-a36c-be01ef52908a\") " pod="openshift-monitoring/node-exporter-dw57d"
Apr 16 16:49:20.252550 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:49:20.252486 2572 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 16:49:20.252550 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:49:20.252549 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2eb8640-dd12-4e24-a36c-be01ef52908a-node-exporter-tls podName:a2eb8640-dd12-4e24-a36c-be01ef52908a nodeName:}" failed. No retries permitted until 2026-04-16 16:49:20.752530613 +0000 UTC m=+77.921241957 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/a2eb8640-dd12-4e24-a36c-be01ef52908a-node-exporter-tls") pod "node-exporter-dw57d" (UID: "a2eb8640-dd12-4e24-a36c-be01ef52908a") : secret "node-exporter-tls" not found
Apr 16 16:49:20.252867 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.252823 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a2eb8640-dd12-4e24-a36c-be01ef52908a-metrics-client-ca\") pod \"node-exporter-dw57d\" (UID: \"a2eb8640-dd12-4e24-a36c-be01ef52908a\") " pod="openshift-monitoring/node-exporter-dw57d"
Apr 16 16:49:20.254940 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.254884 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a2eb8640-dd12-4e24-a36c-be01ef52908a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dw57d\" (UID: \"a2eb8640-dd12-4e24-a36c-be01ef52908a\") " pod="openshift-monitoring/node-exporter-dw57d"
Apr 16 16:49:20.261944 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.261877 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-btfqt\" (UniqueName: \"kubernetes.io/projected/a2eb8640-dd12-4e24-a36c-be01ef52908a-kube-api-access-btfqt\") pod \"node-exporter-dw57d\" (UID: \"a2eb8640-dd12-4e24-a36c-be01ef52908a\") " pod="openshift-monitoring/node-exporter-dw57d"
Apr 16 16:49:20.756395 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.756358 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a2eb8640-dd12-4e24-a36c-be01ef52908a-node-exporter-tls\") pod \"node-exporter-dw57d\" (UID: \"a2eb8640-dd12-4e24-a36c-be01ef52908a\") " pod="openshift-monitoring/node-exporter-dw57d"
Apr 16 16:49:20.758993 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.758969 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a2eb8640-dd12-4e24-a36c-be01ef52908a-node-exporter-tls\") pod \"node-exporter-dw57d\" (UID: \"a2eb8640-dd12-4e24-a36c-be01ef52908a\") " pod="openshift-monitoring/node-exporter-dw57d"
Apr 16 16:49:20.945721 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.945681 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-dw57d"
Apr 16 16:49:20.960421 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:49:20.959884 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2eb8640_dd12_4e24_a36c_be01ef52908a.slice/crio-9a67087d052569676629a65799889bf6a3b848671fdc9e73c8bf3e5c8c525f3b WatchSource:0}: Error finding container 9a67087d052569676629a65799889bf6a3b848671fdc9e73c8bf3e5c8c525f3b: Status 404 returned error can't find the container with id 9a67087d052569676629a65799889bf6a3b848671fdc9e73c8bf3e5c8c525f3b
Apr 16 16:49:20.980001 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:20.979971 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dw57d" event={"ID":"a2eb8640-dd12-4e24-a36c-be01ef52908a","Type":"ContainerStarted","Data":"9a67087d052569676629a65799889bf6a3b848671fdc9e73c8bf3e5c8c525f3b"}
Apr 16 16:49:21.945510 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:21.945454 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-k7p77"
Apr 16 16:49:21.987500 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:21.986713 2572 generic.go:358] "Generic (PLEG): container finished" podID="a2eb8640-dd12-4e24-a36c-be01ef52908a" containerID="8e34bbd2f8df6edf9b21cead7a46223406580d5103476e76fe80743e1193c63e" exitCode=0
Apr 16 16:49:21.987500 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:21.986803 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dw57d" event={"ID":"a2eb8640-dd12-4e24-a36c-be01ef52908a","Type":"ContainerDied","Data":"8e34bbd2f8df6edf9b21cead7a46223406580d5103476e76fe80743e1193c63e"}
Apr 16 16:49:22.993617 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:22.993585 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dw57d" event={"ID":"a2eb8640-dd12-4e24-a36c-be01ef52908a","Type":"ContainerStarted","Data":"25259a8d858e16022f0baa953f4e8bc3b830b162ce0beae77c694121404f5e26"}
Apr 16 16:49:22.993617 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:22.993621 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dw57d" event={"ID":"a2eb8640-dd12-4e24-a36c-be01ef52908a","Type":"ContainerStarted","Data":"1ff9b6174ed9e295454e0d2e13caa43fa066a508e2426ab4ae8a645d1368389e"}
Apr 16 16:49:23.022443 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:23.022399 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-dw57d" podStartSLOduration=2.293517552 podStartE2EDuration="3.022383652s" podCreationTimestamp="2026-04-16 16:49:20 +0000 UTC" firstStartedPulling="2026-04-16 16:49:20.960678486 +0000 UTC m=+78.129389828" lastFinishedPulling="2026-04-16 16:49:21.689544578 +0000 UTC m=+78.858255928" observedRunningTime="2026-04-16 16:49:23.020420354 +0000 UTC m=+80.189131717" watchObservedRunningTime="2026-04-16 16:49:23.022383652 +0000 UTC m=+80.191095015"
Apr 16 16:49:25.925599 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:25.925562 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-576b986fc4-xfn7p"
Apr 16 16:49:25.926172 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:25.925876 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-576b986fc4-xfn7p"
Apr 16 16:49:25.931779 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:25.931759 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-576b986fc4-xfn7p"
Apr 16 16:49:26.007422 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:26.007389 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-576b986fc4-xfn7p"
Apr 16 16:49:28.599654 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:28.599620 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-576b986fc4-xfn7p"]
Apr 16 16:49:30.014587 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:30.014548 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-rgfgm" event={"ID":"20b249ed-1937-4b5b-b328-8a3db9d456fc","Type":"ContainerStarted","Data":"8204efcd5be2af033af488273f53af4191ffda52ebe32c2ef56a5a1f8087f0d3"}
Apr 16 16:49:30.015013 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:30.014756 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-rgfgm"
Apr 16 16:49:30.022606 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:30.022577 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-rgfgm"
Apr 16 16:49:30.034651 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:30.034580 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-rgfgm" podStartSLOduration=2.690101382 podStartE2EDuration="21.034563938s" podCreationTimestamp="2026-04-16 16:49:09 +0000 UTC" firstStartedPulling="2026-04-16 16:49:10.784353136 +0000 UTC m=+67.953064493" lastFinishedPulling="2026-04-16 16:49:29.128815696 +0000 UTC m=+86.297527049" observedRunningTime="2026-04-16 16:49:30.032489464 +0000 UTC m=+87.201200828" watchObservedRunningTime="2026-04-16 16:49:30.034563938 +0000 UTC m=+87.203275304"
Apr 16 16:49:30.935618 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:30.935586 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8"
Apr 16 16:49:30.945463 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:30.945423 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5798d64495-6g9hk"]
Apr 16 16:49:30.979668 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:30.979642 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5798d64495-6g9hk"]
Apr 16 16:49:30.979822 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:30.979767 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5798d64495-6g9hk"
Apr 16 16:49:30.987959 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:30.987937 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 16:49:31.041465 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:31.041428 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b1400945-53f0-41e1-826d-2bab62fe3a95-service-ca\") pod \"console-5798d64495-6g9hk\" (UID: \"b1400945-53f0-41e1-826d-2bab62fe3a95\") " pod="openshift-console/console-5798d64495-6g9hk"
Apr 16 16:49:31.041864 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:31.041475 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b1400945-53f0-41e1-826d-2bab62fe3a95-oauth-serving-cert\") pod \"console-5798d64495-6g9hk\" (UID: \"b1400945-53f0-41e1-826d-2bab62fe3a95\") " pod="openshift-console/console-5798d64495-6g9hk"
Apr 16 16:49:31.041864 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:31.041512 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1400945-53f0-41e1-826d-2bab62fe3a95-trusted-ca-bundle\") pod \"console-5798d64495-6g9hk\" (UID: \"b1400945-53f0-41e1-826d-2bab62fe3a95\") " pod="openshift-console/console-5798d64495-6g9hk"
Apr 16 16:49:31.041864 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:31.041570 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1400945-53f0-41e1-826d-2bab62fe3a95-console-serving-cert\") pod \"console-5798d64495-6g9hk\" (UID: \"b1400945-53f0-41e1-826d-2bab62fe3a95\") " pod="openshift-console/console-5798d64495-6g9hk"
Apr 16 16:49:31.041864 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:31.041593 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b1400945-53f0-41e1-826d-2bab62fe3a95-console-config\") pod \"console-5798d64495-6g9hk\" (UID: \"b1400945-53f0-41e1-826d-2bab62fe3a95\") " pod="openshift-console/console-5798d64495-6g9hk"
Apr 16 16:49:31.041864 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:31.041626 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxhr2\" (UniqueName: \"kubernetes.io/projected/b1400945-53f0-41e1-826d-2bab62fe3a95-kube-api-access-jxhr2\") pod \"console-5798d64495-6g9hk\" (UID: \"b1400945-53f0-41e1-826d-2bab62fe3a95\") " pod="openshift-console/console-5798d64495-6g9hk"
Apr 16 16:49:31.041864 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:31.041736 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b1400945-53f0-41e1-826d-2bab62fe3a95-console-oauth-config\") pod \"console-5798d64495-6g9hk\" (UID: \"b1400945-53f0-41e1-826d-2bab62fe3a95\") " pod="openshift-console/console-5798d64495-6g9hk"
Apr 16 16:49:31.142832 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:31.142791 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1400945-53f0-41e1-826d-2bab62fe3a95-console-serving-cert\") pod \"console-5798d64495-6g9hk\" (UID: \"b1400945-53f0-41e1-826d-2bab62fe3a95\") " pod="openshift-console/console-5798d64495-6g9hk"
Apr 16 16:49:31.143016 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:31.142837 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b1400945-53f0-41e1-826d-2bab62fe3a95-console-config\") pod \"console-5798d64495-6g9hk\" (UID: \"b1400945-53f0-41e1-826d-2bab62fe3a95\") " pod="openshift-console/console-5798d64495-6g9hk"
Apr 16 16:49:31.143016 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:31.142882 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxhr2\" (UniqueName: \"kubernetes.io/projected/b1400945-53f0-41e1-826d-2bab62fe3a95-kube-api-access-jxhr2\") pod \"console-5798d64495-6g9hk\" (UID: \"b1400945-53f0-41e1-826d-2bab62fe3a95\") " pod="openshift-console/console-5798d64495-6g9hk"
Apr 16 16:49:31.143016 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:31.142939 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b1400945-53f0-41e1-826d-2bab62fe3a95-console-oauth-config\") pod \"console-5798d64495-6g9hk\" (UID: \"b1400945-53f0-41e1-826d-2bab62fe3a95\") " pod="openshift-console/console-5798d64495-6g9hk"
Apr 16 16:49:31.143016 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:31.142993 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b1400945-53f0-41e1-826d-2bab62fe3a95-service-ca\") pod \"console-5798d64495-6g9hk\" (UID: \"b1400945-53f0-41e1-826d-2bab62fe3a95\") " pod="openshift-console/console-5798d64495-6g9hk"
Apr 16 16:49:31.143218 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:31.143018 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b1400945-53f0-41e1-826d-2bab62fe3a95-oauth-serving-cert\") pod \"console-5798d64495-6g9hk\" (UID: \"b1400945-53f0-41e1-826d-2bab62fe3a95\") " pod="openshift-console/console-5798d64495-6g9hk"
Apr 16 16:49:31.143218 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:31.143053 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1400945-53f0-41e1-826d-2bab62fe3a95-trusted-ca-bundle\") pod \"console-5798d64495-6g9hk\" (UID: \"b1400945-53f0-41e1-826d-2bab62fe3a95\") " pod="openshift-console/console-5798d64495-6g9hk"
Apr 16 16:49:31.147098 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:31.144485 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b1400945-53f0-41e1-826d-2bab62fe3a95-console-config\") pod \"console-5798d64495-6g9hk\" (UID: \"b1400945-53f0-41e1-826d-2bab62fe3a95\") " pod="openshift-console/console-5798d64495-6g9hk"
Apr 16 16:49:31.148162 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:31.148133 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1400945-53f0-41e1-826d-2bab62fe3a95-trusted-ca-bundle\") pod \"console-5798d64495-6g9hk\" (UID: \"b1400945-53f0-41e1-826d-2bab62fe3a95\") " pod="openshift-console/console-5798d64495-6g9hk"
Apr 16 16:49:31.149344 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:31.149318 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b1400945-53f0-41e1-826d-2bab62fe3a95-service-ca\") pod \"console-5798d64495-6g9hk\" (UID: \"b1400945-53f0-41e1-826d-2bab62fe3a95\") " pod="openshift-console/console-5798d64495-6g9hk"
Apr 16 16:49:31.149559 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:31.149532 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b1400945-53f0-41e1-826d-2bab62fe3a95-console-oauth-config\") pod \"console-5798d64495-6g9hk\" (UID: \"b1400945-53f0-41e1-826d-2bab62fe3a95\") " pod="openshift-console/console-5798d64495-6g9hk"
Apr 16 16:49:31.149688 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:31.149668 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1400945-53f0-41e1-826d-2bab62fe3a95-console-serving-cert\") pod \"console-5798d64495-6g9hk\" (UID: \"b1400945-53f0-41e1-826d-2bab62fe3a95\") " pod="openshift-console/console-5798d64495-6g9hk"
Apr 16 16:49:31.150484 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:31.150459 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b1400945-53f0-41e1-826d-2bab62fe3a95-oauth-serving-cert\") pod \"console-5798d64495-6g9hk\" (UID: \"b1400945-53f0-41e1-826d-2bab62fe3a95\") " pod="openshift-console/console-5798d64495-6g9hk"
Apr 16 16:49:31.153000 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:31.152958 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxhr2\" (UniqueName: \"kubernetes.io/projected/b1400945-53f0-41e1-826d-2bab62fe3a95-kube-api-access-jxhr2\") pod \"console-5798d64495-6g9hk\" (UID: \"b1400945-53f0-41e1-826d-2bab62fe3a95\") " pod="openshift-console/console-5798d64495-6g9hk"
Apr 16 16:49:31.290677 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:31.290580 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5798d64495-6g9hk"
Apr 16 16:49:31.446794 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:31.446742 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5798d64495-6g9hk"]
Apr 16 16:49:31.450982 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:49:31.450946 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1400945_53f0_41e1_826d_2bab62fe3a95.slice/crio-312776df46b7084e75eb094ebf573a5624d64c0f22b5bb83afa9fb8ac7a5d5f2 WatchSource:0}: Error finding container 312776df46b7084e75eb094ebf573a5624d64c0f22b5bb83afa9fb8ac7a5d5f2: Status 404 returned error can't find the container with id 312776df46b7084e75eb094ebf573a5624d64c0f22b5bb83afa9fb8ac7a5d5f2
Apr 16 16:49:32.022076 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:32.022019 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5798d64495-6g9hk" event={"ID":"b1400945-53f0-41e1-826d-2bab62fe3a95","Type":"ContainerStarted","Data":"28e1927c922c71da7f834e47e763b9b6305d23610b43439c7a03e88418406783"}
Apr 16 16:49:32.022076 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:32.022079 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5798d64495-6g9hk" event={"ID":"b1400945-53f0-41e1-826d-2bab62fe3a95","Type":"ContainerStarted","Data":"312776df46b7084e75eb094ebf573a5624d64c0f22b5bb83afa9fb8ac7a5d5f2"}
Apr 16 16:49:32.040428 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:32.040380 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5798d64495-6g9hk" podStartSLOduration=2.040364107 podStartE2EDuration="2.040364107s" podCreationTimestamp="2026-04-16 16:49:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:49:32.038448702 +0000 UTC m=+89.207160066" watchObservedRunningTime="2026-04-16 16:49:32.040364107 +0000 UTC m=+89.209075471"
Apr 16 16:49:32.947461 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:32.947432 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5b8c7794b5-ntls8"
Apr 16 16:49:35.963544 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:35.963469 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8" podUID="fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4" containerName="registry" containerID="cri-o://68edcd3f53c00d478a62dec639076b2d395094db2a723b459490f7205f50e751" gracePeriod=30
Apr 16 16:49:36.232461 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:36.232439 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8"
Apr 16 16:49:36.392251 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:36.392219 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-registry-tls\") pod \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") "
Apr 16 16:49:36.392433 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:36.392274 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-bound-sa-token\") pod \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") "
Apr 16 16:49:36.392433 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:36.392309 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rmql\" (UniqueName: \"kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-kube-api-access-5rmql\") pod \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") "
Apr 16 16:49:36.392433 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:36.392357 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-ca-trust-extracted\") pod \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") "
Apr 16 16:49:36.392433 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:36.392387 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-registry-certificates\") pod \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") "
Apr 16 16:49:36.392433 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:36.392417 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-installation-pull-secrets\") pod \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") "
Apr 16 16:49:36.392671 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:36.392451 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-image-registry-private-configuration\") pod \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\" (UID: \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") "
Apr 16 16:49:36.392671 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:36.392476 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-trusted-ca\") pod \"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\" (UID:
\"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4\") " Apr 16 16:49:36.392882 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:36.392841 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4" (UID: "fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:49:36.393032 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:36.392981 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4" (UID: "fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:49:36.395040 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:36.394988 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4" (UID: "fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:49:36.395040 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:36.395022 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4" (UID: "fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:49:36.395209 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:36.395086 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4" (UID: "fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:49:36.395380 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:36.395357 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-kube-api-access-5rmql" (OuterVolumeSpecName: "kube-api-access-5rmql") pod "fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4" (UID: "fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4"). InnerVolumeSpecName "kube-api-access-5rmql". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:49:36.404579 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:36.404558 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4" (UID: "fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:49:36.405241 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:36.405212 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4" (UID: "fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:49:36.493391 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:36.493312 2572 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-registry-tls\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:49:36.493391 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:36.493344 2572 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-bound-sa-token\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:49:36.493391 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:36.493357 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5rmql\" (UniqueName: \"kubernetes.io/projected/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-kube-api-access-5rmql\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:49:36.493391 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:36.493371 2572 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-ca-trust-extracted\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:49:36.493391 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:36.493383 2572 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-registry-certificates\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:49:36.493680 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:36.493396 2572 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-installation-pull-secrets\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 
16:49:36.493680 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:36.493410 2572 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-image-registry-private-configuration\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:49:36.493680 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:36.493422 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4-trusted-ca\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:49:37.038586 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:37.038548 2572 generic.go:358] "Generic (PLEG): container finished" podID="fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4" containerID="68edcd3f53c00d478a62dec639076b2d395094db2a723b459490f7205f50e751" exitCode=0 Apr 16 16:49:37.038967 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:37.038628 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8" Apr 16 16:49:37.038967 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:37.038629 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8" event={"ID":"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4","Type":"ContainerDied","Data":"68edcd3f53c00d478a62dec639076b2d395094db2a723b459490f7205f50e751"} Apr 16 16:49:37.038967 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:37.038739 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b8c6f94d8-cghq8" event={"ID":"fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4","Type":"ContainerDied","Data":"2007da6542bfc20a44bc0b16fc764a8e11e2702847e20f5e0c273dd26a935d27"} Apr 16 16:49:37.038967 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:37.038755 2572 scope.go:117] "RemoveContainer" containerID="68edcd3f53c00d478a62dec639076b2d395094db2a723b459490f7205f50e751" Apr 16 16:49:37.048008 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:37.047975 2572 scope.go:117] "RemoveContainer" containerID="68edcd3f53c00d478a62dec639076b2d395094db2a723b459490f7205f50e751" Apr 16 16:49:37.048297 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:49:37.048278 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68edcd3f53c00d478a62dec639076b2d395094db2a723b459490f7205f50e751\": container with ID starting with 68edcd3f53c00d478a62dec639076b2d395094db2a723b459490f7205f50e751 not found: ID does not exist" containerID="68edcd3f53c00d478a62dec639076b2d395094db2a723b459490f7205f50e751" Apr 16 16:49:37.048372 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:37.048307 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68edcd3f53c00d478a62dec639076b2d395094db2a723b459490f7205f50e751"} err="failed to get container status 
\"68edcd3f53c00d478a62dec639076b2d395094db2a723b459490f7205f50e751\": rpc error: code = NotFound desc = could not find container \"68edcd3f53c00d478a62dec639076b2d395094db2a723b459490f7205f50e751\": container with ID starting with 68edcd3f53c00d478a62dec639076b2d395094db2a723b459490f7205f50e751 not found: ID does not exist" Apr 16 16:49:37.065651 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:37.065629 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-b8c6f94d8-cghq8"] Apr 16 16:49:37.072711 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:37.072685 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-b8c6f94d8-cghq8"] Apr 16 16:49:37.590755 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:37.590718 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4" path="/var/lib/kubelet/pods/fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4/volumes" Apr 16 16:49:41.291044 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:41.291008 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5798d64495-6g9hk" Apr 16 16:49:41.291433 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:41.291108 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5798d64495-6g9hk" Apr 16 16:49:41.295388 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:41.295369 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5798d64495-6g9hk" Apr 16 16:49:42.057578 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:42.057550 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5798d64495-6g9hk" Apr 16 16:49:43.949435 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:43.949409 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-network-diagnostics/network-check-target-s7xbf" Apr 16 16:49:54.029709 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:54.029654 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-576b986fc4-xfn7p" podUID="4bb1ed34-b734-410e-b2ba-452f8604f2ef" containerName="console" containerID="cri-o://c5f9647de732088c292f2210842e87b8848daab98ae645811647201f2504465c" gracePeriod=15 Apr 16 16:49:54.282720 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:54.282672 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-576b986fc4-xfn7p_4bb1ed34-b734-410e-b2ba-452f8604f2ef/console/0.log" Apr 16 16:49:54.282808 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:54.282730 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-576b986fc4-xfn7p" Apr 16 16:49:54.322349 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:54.322326 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4bb1ed34-b734-410e-b2ba-452f8604f2ef-console-serving-cert\") pod \"4bb1ed34-b734-410e-b2ba-452f8604f2ef\" (UID: \"4bb1ed34-b734-410e-b2ba-452f8604f2ef\") " Apr 16 16:49:54.322534 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:54.322363 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4bb1ed34-b734-410e-b2ba-452f8604f2ef-console-oauth-config\") pod \"4bb1ed34-b734-410e-b2ba-452f8604f2ef\" (UID: \"4bb1ed34-b734-410e-b2ba-452f8604f2ef\") " Apr 16 16:49:54.322534 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:54.322422 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4bb1ed34-b734-410e-b2ba-452f8604f2ef-oauth-serving-cert\") pod \"4bb1ed34-b734-410e-b2ba-452f8604f2ef\" (UID: 
\"4bb1ed34-b734-410e-b2ba-452f8604f2ef\") " Apr 16 16:49:54.322534 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:54.322513 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxjj8\" (UniqueName: \"kubernetes.io/projected/4bb1ed34-b734-410e-b2ba-452f8604f2ef-kube-api-access-rxjj8\") pod \"4bb1ed34-b734-410e-b2ba-452f8604f2ef\" (UID: \"4bb1ed34-b734-410e-b2ba-452f8604f2ef\") " Apr 16 16:49:54.322705 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:54.322565 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4bb1ed34-b734-410e-b2ba-452f8604f2ef-service-ca\") pod \"4bb1ed34-b734-410e-b2ba-452f8604f2ef\" (UID: \"4bb1ed34-b734-410e-b2ba-452f8604f2ef\") " Apr 16 16:49:54.322705 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:54.322588 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4bb1ed34-b734-410e-b2ba-452f8604f2ef-console-config\") pod \"4bb1ed34-b734-410e-b2ba-452f8604f2ef\" (UID: \"4bb1ed34-b734-410e-b2ba-452f8604f2ef\") " Apr 16 16:49:54.322836 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:54.322813 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb1ed34-b734-410e-b2ba-452f8604f2ef-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4bb1ed34-b734-410e-b2ba-452f8604f2ef" (UID: "4bb1ed34-b734-410e-b2ba-452f8604f2ef"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:49:54.323122 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:54.323100 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb1ed34-b734-410e-b2ba-452f8604f2ef-console-config" (OuterVolumeSpecName: "console-config") pod "4bb1ed34-b734-410e-b2ba-452f8604f2ef" (UID: "4bb1ed34-b734-410e-b2ba-452f8604f2ef"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:49:54.323219 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:54.323097 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb1ed34-b734-410e-b2ba-452f8604f2ef-service-ca" (OuterVolumeSpecName: "service-ca") pod "4bb1ed34-b734-410e-b2ba-452f8604f2ef" (UID: "4bb1ed34-b734-410e-b2ba-452f8604f2ef"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:49:54.324587 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:54.324560 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb1ed34-b734-410e-b2ba-452f8604f2ef-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4bb1ed34-b734-410e-b2ba-452f8604f2ef" (UID: "4bb1ed34-b734-410e-b2ba-452f8604f2ef"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:49:54.324682 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:54.324588 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb1ed34-b734-410e-b2ba-452f8604f2ef-kube-api-access-rxjj8" (OuterVolumeSpecName: "kube-api-access-rxjj8") pod "4bb1ed34-b734-410e-b2ba-452f8604f2ef" (UID: "4bb1ed34-b734-410e-b2ba-452f8604f2ef"). InnerVolumeSpecName "kube-api-access-rxjj8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:49:54.324682 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:54.324598 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb1ed34-b734-410e-b2ba-452f8604f2ef-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4bb1ed34-b734-410e-b2ba-452f8604f2ef" (UID: "4bb1ed34-b734-410e-b2ba-452f8604f2ef"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:49:54.423306 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:54.423284 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4bb1ed34-b734-410e-b2ba-452f8604f2ef-console-serving-cert\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:49:54.423306 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:54.423304 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4bb1ed34-b734-410e-b2ba-452f8604f2ef-console-oauth-config\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:49:54.423465 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:54.423313 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4bb1ed34-b734-410e-b2ba-452f8604f2ef-oauth-serving-cert\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:49:54.423465 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:54.423322 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rxjj8\" (UniqueName: \"kubernetes.io/projected/4bb1ed34-b734-410e-b2ba-452f8604f2ef-kube-api-access-rxjj8\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:49:54.423465 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:54.423331 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/4bb1ed34-b734-410e-b2ba-452f8604f2ef-service-ca\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:49:54.423465 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:54.423340 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4bb1ed34-b734-410e-b2ba-452f8604f2ef-console-config\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:49:55.088199 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:55.088172 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-576b986fc4-xfn7p_4bb1ed34-b734-410e-b2ba-452f8604f2ef/console/0.log" Apr 16 16:49:55.088578 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:55.088209 2572 generic.go:358] "Generic (PLEG): container finished" podID="4bb1ed34-b734-410e-b2ba-452f8604f2ef" containerID="c5f9647de732088c292f2210842e87b8848daab98ae645811647201f2504465c" exitCode=2 Apr 16 16:49:55.088578 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:55.088279 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-576b986fc4-xfn7p" Apr 16 16:49:55.088578 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:55.088295 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-576b986fc4-xfn7p" event={"ID":"4bb1ed34-b734-410e-b2ba-452f8604f2ef","Type":"ContainerDied","Data":"c5f9647de732088c292f2210842e87b8848daab98ae645811647201f2504465c"} Apr 16 16:49:55.088578 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:55.088325 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-576b986fc4-xfn7p" event={"ID":"4bb1ed34-b734-410e-b2ba-452f8604f2ef","Type":"ContainerDied","Data":"bf96e6529b55cb0d8b1463ef119fd2edbe24f10658258c13dd5bf5fc6385a784"} Apr 16 16:49:55.088578 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:55.088341 2572 scope.go:117] "RemoveContainer" containerID="c5f9647de732088c292f2210842e87b8848daab98ae645811647201f2504465c" Apr 16 16:49:55.095775 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:55.095536 2572 scope.go:117] "RemoveContainer" containerID="c5f9647de732088c292f2210842e87b8848daab98ae645811647201f2504465c" Apr 16 16:49:55.095901 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:49:55.095870 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5f9647de732088c292f2210842e87b8848daab98ae645811647201f2504465c\": container with ID starting with c5f9647de732088c292f2210842e87b8848daab98ae645811647201f2504465c not found: ID does not exist" containerID="c5f9647de732088c292f2210842e87b8848daab98ae645811647201f2504465c" Apr 16 16:49:55.095976 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:55.095905 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5f9647de732088c292f2210842e87b8848daab98ae645811647201f2504465c"} err="failed to get container status \"c5f9647de732088c292f2210842e87b8848daab98ae645811647201f2504465c\": rpc error: code = 
NotFound desc = could not find container \"c5f9647de732088c292f2210842e87b8848daab98ae645811647201f2504465c\": container with ID starting with c5f9647de732088c292f2210842e87b8848daab98ae645811647201f2504465c not found: ID does not exist" Apr 16 16:49:55.118417 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:55.118389 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-576b986fc4-xfn7p"] Apr 16 16:49:55.122502 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:55.122484 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-576b986fc4-xfn7p"] Apr 16 16:49:55.589619 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:49:55.589585 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb1ed34-b734-410e-b2ba-452f8604f2ef" path="/var/lib/kubelet/pods/4bb1ed34-b734-410e-b2ba-452f8604f2ef/volumes" Apr 16 16:50:20.573662 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:20.573622 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7c4ff98cb8-l9wgn"] Apr 16 16:50:20.574231 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:20.573875 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4" containerName="registry" Apr 16 16:50:20.574231 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:20.573885 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4" containerName="registry" Apr 16 16:50:20.574231 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:20.573901 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4bb1ed34-b734-410e-b2ba-452f8604f2ef" containerName="console" Apr 16 16:50:20.574231 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:20.573906 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bb1ed34-b734-410e-b2ba-452f8604f2ef" containerName="console" Apr 16 16:50:20.574231 ip-10-0-137-126 kubenswrapper[2572]: I0416 
16:50:20.573946 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="fa4c43d0-f045-46fe-9f44-2a2fb4a7e9d4" containerName="registry" Apr 16 16:50:20.574231 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:20.573961 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="4bb1ed34-b734-410e-b2ba-452f8604f2ef" containerName="console" Apr 16 16:50:20.576531 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:20.576508 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c4ff98cb8-l9wgn" Apr 16 16:50:20.589484 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:20.589462 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c4ff98cb8-l9wgn"] Apr 16 16:50:20.715101 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:20.715057 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mthx8\" (UniqueName: \"kubernetes.io/projected/9a328696-e46f-4e4f-9ae0-416fa37377cd-kube-api-access-mthx8\") pod \"console-7c4ff98cb8-l9wgn\" (UID: \"9a328696-e46f-4e4f-9ae0-416fa37377cd\") " pod="openshift-console/console-7c4ff98cb8-l9wgn" Apr 16 16:50:20.715248 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:20.715118 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9a328696-e46f-4e4f-9ae0-416fa37377cd-console-config\") pod \"console-7c4ff98cb8-l9wgn\" (UID: \"9a328696-e46f-4e4f-9ae0-416fa37377cd\") " pod="openshift-console/console-7c4ff98cb8-l9wgn" Apr 16 16:50:20.715248 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:20.715146 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a328696-e46f-4e4f-9ae0-416fa37377cd-trusted-ca-bundle\") pod \"console-7c4ff98cb8-l9wgn\" (UID: 
\"9a328696-e46f-4e4f-9ae0-416fa37377cd\") " pod="openshift-console/console-7c4ff98cb8-l9wgn" Apr 16 16:50:20.715248 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:20.715170 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a328696-e46f-4e4f-9ae0-416fa37377cd-console-serving-cert\") pod \"console-7c4ff98cb8-l9wgn\" (UID: \"9a328696-e46f-4e4f-9ae0-416fa37377cd\") " pod="openshift-console/console-7c4ff98cb8-l9wgn" Apr 16 16:50:20.715248 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:20.715200 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9a328696-e46f-4e4f-9ae0-416fa37377cd-service-ca\") pod \"console-7c4ff98cb8-l9wgn\" (UID: \"9a328696-e46f-4e4f-9ae0-416fa37377cd\") " pod="openshift-console/console-7c4ff98cb8-l9wgn" Apr 16 16:50:20.715402 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:20.715270 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9a328696-e46f-4e4f-9ae0-416fa37377cd-oauth-serving-cert\") pod \"console-7c4ff98cb8-l9wgn\" (UID: \"9a328696-e46f-4e4f-9ae0-416fa37377cd\") " pod="openshift-console/console-7c4ff98cb8-l9wgn" Apr 16 16:50:20.715402 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:20.715304 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9a328696-e46f-4e4f-9ae0-416fa37377cd-console-oauth-config\") pod \"console-7c4ff98cb8-l9wgn\" (UID: \"9a328696-e46f-4e4f-9ae0-416fa37377cd\") " pod="openshift-console/console-7c4ff98cb8-l9wgn" Apr 16 16:50:20.815756 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:20.815731 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9a328696-e46f-4e4f-9ae0-416fa37377cd-console-oauth-config\") pod \"console-7c4ff98cb8-l9wgn\" (UID: \"9a328696-e46f-4e4f-9ae0-416fa37377cd\") " pod="openshift-console/console-7c4ff98cb8-l9wgn" Apr 16 16:50:20.815868 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:20.815770 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mthx8\" (UniqueName: \"kubernetes.io/projected/9a328696-e46f-4e4f-9ae0-416fa37377cd-kube-api-access-mthx8\") pod \"console-7c4ff98cb8-l9wgn\" (UID: \"9a328696-e46f-4e4f-9ae0-416fa37377cd\") " pod="openshift-console/console-7c4ff98cb8-l9wgn" Apr 16 16:50:20.815868 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:20.815795 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9a328696-e46f-4e4f-9ae0-416fa37377cd-console-config\") pod \"console-7c4ff98cb8-l9wgn\" (UID: \"9a328696-e46f-4e4f-9ae0-416fa37377cd\") " pod="openshift-console/console-7c4ff98cb8-l9wgn" Apr 16 16:50:20.815868 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:20.815813 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a328696-e46f-4e4f-9ae0-416fa37377cd-trusted-ca-bundle\") pod \"console-7c4ff98cb8-l9wgn\" (UID: \"9a328696-e46f-4e4f-9ae0-416fa37377cd\") " pod="openshift-console/console-7c4ff98cb8-l9wgn" Apr 16 16:50:20.815868 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:20.815835 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a328696-e46f-4e4f-9ae0-416fa37377cd-console-serving-cert\") pod \"console-7c4ff98cb8-l9wgn\" (UID: \"9a328696-e46f-4e4f-9ae0-416fa37377cd\") " pod="openshift-console/console-7c4ff98cb8-l9wgn" Apr 16 16:50:20.815868 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:20.815853 
2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9a328696-e46f-4e4f-9ae0-416fa37377cd-service-ca\") pod \"console-7c4ff98cb8-l9wgn\" (UID: \"9a328696-e46f-4e4f-9ae0-416fa37377cd\") " pod="openshift-console/console-7c4ff98cb8-l9wgn" Apr 16 16:50:20.816132 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:20.815877 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9a328696-e46f-4e4f-9ae0-416fa37377cd-oauth-serving-cert\") pod \"console-7c4ff98cb8-l9wgn\" (UID: \"9a328696-e46f-4e4f-9ae0-416fa37377cd\") " pod="openshift-console/console-7c4ff98cb8-l9wgn" Apr 16 16:50:20.816566 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:20.816547 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9a328696-e46f-4e4f-9ae0-416fa37377cd-service-ca\") pod \"console-7c4ff98cb8-l9wgn\" (UID: \"9a328696-e46f-4e4f-9ae0-416fa37377cd\") " pod="openshift-console/console-7c4ff98cb8-l9wgn" Apr 16 16:50:20.816718 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:20.816694 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9a328696-e46f-4e4f-9ae0-416fa37377cd-oauth-serving-cert\") pod \"console-7c4ff98cb8-l9wgn\" (UID: \"9a328696-e46f-4e4f-9ae0-416fa37377cd\") " pod="openshift-console/console-7c4ff98cb8-l9wgn" Apr 16 16:50:20.816785 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:20.816694 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9a328696-e46f-4e4f-9ae0-416fa37377cd-console-config\") pod \"console-7c4ff98cb8-l9wgn\" (UID: \"9a328696-e46f-4e4f-9ae0-416fa37377cd\") " pod="openshift-console/console-7c4ff98cb8-l9wgn" Apr 16 16:50:20.816915 ip-10-0-137-126 kubenswrapper[2572]: I0416 
16:50:20.816890 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a328696-e46f-4e4f-9ae0-416fa37377cd-trusted-ca-bundle\") pod \"console-7c4ff98cb8-l9wgn\" (UID: \"9a328696-e46f-4e4f-9ae0-416fa37377cd\") " pod="openshift-console/console-7c4ff98cb8-l9wgn" Apr 16 16:50:20.818181 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:20.818160 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9a328696-e46f-4e4f-9ae0-416fa37377cd-console-oauth-config\") pod \"console-7c4ff98cb8-l9wgn\" (UID: \"9a328696-e46f-4e4f-9ae0-416fa37377cd\") " pod="openshift-console/console-7c4ff98cb8-l9wgn" Apr 16 16:50:20.818363 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:20.818344 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a328696-e46f-4e4f-9ae0-416fa37377cd-console-serving-cert\") pod \"console-7c4ff98cb8-l9wgn\" (UID: \"9a328696-e46f-4e4f-9ae0-416fa37377cd\") " pod="openshift-console/console-7c4ff98cb8-l9wgn" Apr 16 16:50:20.823866 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:20.823817 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mthx8\" (UniqueName: \"kubernetes.io/projected/9a328696-e46f-4e4f-9ae0-416fa37377cd-kube-api-access-mthx8\") pod \"console-7c4ff98cb8-l9wgn\" (UID: \"9a328696-e46f-4e4f-9ae0-416fa37377cd\") " pod="openshift-console/console-7c4ff98cb8-l9wgn" Apr 16 16:50:20.884900 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:20.884881 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c4ff98cb8-l9wgn" Apr 16 16:50:21.010935 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:21.010905 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c4ff98cb8-l9wgn"] Apr 16 16:50:21.014705 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:50:21.014668 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a328696_e46f_4e4f_9ae0_416fa37377cd.slice/crio-06d9e22f8ea6f9d638993b9ad76165dca3fd3a3158ccb2db21f1e3c113811d1f WatchSource:0}: Error finding container 06d9e22f8ea6f9d638993b9ad76165dca3fd3a3158ccb2db21f1e3c113811d1f: Status 404 returned error can't find the container with id 06d9e22f8ea6f9d638993b9ad76165dca3fd3a3158ccb2db21f1e3c113811d1f Apr 16 16:50:21.154353 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:21.154319 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c4ff98cb8-l9wgn" event={"ID":"9a328696-e46f-4e4f-9ae0-416fa37377cd","Type":"ContainerStarted","Data":"5c433262eba882bef2fef72e1de270a81e70d61f5d62109ae210e5f2a38c5f78"} Apr 16 16:50:21.154516 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:21.154361 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c4ff98cb8-l9wgn" event={"ID":"9a328696-e46f-4e4f-9ae0-416fa37377cd","Type":"ContainerStarted","Data":"06d9e22f8ea6f9d638993b9ad76165dca3fd3a3158ccb2db21f1e3c113811d1f"} Apr 16 16:50:21.173903 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:21.173861 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7c4ff98cb8-l9wgn" podStartSLOduration=1.173848532 podStartE2EDuration="1.173848532s" podCreationTimestamp="2026-04-16 16:50:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:50:21.173212935 +0000 UTC 
m=+138.341924298" watchObservedRunningTime="2026-04-16 16:50:21.173848532 +0000 UTC m=+138.342559894" Apr 16 16:50:30.885552 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:30.885500 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7c4ff98cb8-l9wgn" Apr 16 16:50:30.886048 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:30.885574 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7c4ff98cb8-l9wgn" Apr 16 16:50:30.890158 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:30.890138 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7c4ff98cb8-l9wgn" Apr 16 16:50:31.185433 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:31.185351 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7c4ff98cb8-l9wgn" Apr 16 16:50:31.231611 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:31.231584 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5798d64495-6g9hk"] Apr 16 16:50:56.249512 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:56.249475 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5798d64495-6g9hk" podUID="b1400945-53f0-41e1-826d-2bab62fe3a95" containerName="console" containerID="cri-o://28e1927c922c71da7f834e47e763b9b6305d23610b43439c7a03e88418406783" gracePeriod=15 Apr 16 16:50:56.477367 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:56.477344 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5798d64495-6g9hk_b1400945-53f0-41e1-826d-2bab62fe3a95/console/0.log" Apr 16 16:50:56.477463 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:56.477406 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5798d64495-6g9hk" Apr 16 16:50:56.576496 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:56.576423 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b1400945-53f0-41e1-826d-2bab62fe3a95-console-config\") pod \"b1400945-53f0-41e1-826d-2bab62fe3a95\" (UID: \"b1400945-53f0-41e1-826d-2bab62fe3a95\") " Apr 16 16:50:56.576496 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:56.576472 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1400945-53f0-41e1-826d-2bab62fe3a95-trusted-ca-bundle\") pod \"b1400945-53f0-41e1-826d-2bab62fe3a95\" (UID: \"b1400945-53f0-41e1-826d-2bab62fe3a95\") " Apr 16 16:50:56.576685 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:56.576500 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b1400945-53f0-41e1-826d-2bab62fe3a95-oauth-serving-cert\") pod \"b1400945-53f0-41e1-826d-2bab62fe3a95\" (UID: \"b1400945-53f0-41e1-826d-2bab62fe3a95\") " Apr 16 16:50:56.576685 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:56.576625 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxhr2\" (UniqueName: \"kubernetes.io/projected/b1400945-53f0-41e1-826d-2bab62fe3a95-kube-api-access-jxhr2\") pod \"b1400945-53f0-41e1-826d-2bab62fe3a95\" (UID: \"b1400945-53f0-41e1-826d-2bab62fe3a95\") " Apr 16 16:50:56.576790 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:56.576696 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b1400945-53f0-41e1-826d-2bab62fe3a95-console-oauth-config\") pod \"b1400945-53f0-41e1-826d-2bab62fe3a95\" (UID: \"b1400945-53f0-41e1-826d-2bab62fe3a95\") " Apr 16 16:50:56.576790 
ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:56.576731 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1400945-53f0-41e1-826d-2bab62fe3a95-console-serving-cert\") pod \"b1400945-53f0-41e1-826d-2bab62fe3a95\" (UID: \"b1400945-53f0-41e1-826d-2bab62fe3a95\") " Apr 16 16:50:56.576790 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:56.576770 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b1400945-53f0-41e1-826d-2bab62fe3a95-service-ca\") pod \"b1400945-53f0-41e1-826d-2bab62fe3a95\" (UID: \"b1400945-53f0-41e1-826d-2bab62fe3a95\") " Apr 16 16:50:56.576790 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:56.576777 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1400945-53f0-41e1-826d-2bab62fe3a95-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b1400945-53f0-41e1-826d-2bab62fe3a95" (UID: "b1400945-53f0-41e1-826d-2bab62fe3a95"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:50:56.576976 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:56.576852 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1400945-53f0-41e1-826d-2bab62fe3a95-console-config" (OuterVolumeSpecName: "console-config") pod "b1400945-53f0-41e1-826d-2bab62fe3a95" (UID: "b1400945-53f0-41e1-826d-2bab62fe3a95"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:50:56.576976 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:56.576959 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b1400945-53f0-41e1-826d-2bab62fe3a95-console-config\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:50:56.576976 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:56.576971 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b1400945-53f0-41e1-826d-2bab62fe3a95-oauth-serving-cert\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:50:56.577189 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:56.577167 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1400945-53f0-41e1-826d-2bab62fe3a95-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b1400945-53f0-41e1-826d-2bab62fe3a95" (UID: "b1400945-53f0-41e1-826d-2bab62fe3a95"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:50:56.577246 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:56.577206 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1400945-53f0-41e1-826d-2bab62fe3a95-service-ca" (OuterVolumeSpecName: "service-ca") pod "b1400945-53f0-41e1-826d-2bab62fe3a95" (UID: "b1400945-53f0-41e1-826d-2bab62fe3a95"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:50:56.578716 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:56.578693 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1400945-53f0-41e1-826d-2bab62fe3a95-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b1400945-53f0-41e1-826d-2bab62fe3a95" (UID: "b1400945-53f0-41e1-826d-2bab62fe3a95"). 
InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:50:56.579241 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:56.579211 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1400945-53f0-41e1-826d-2bab62fe3a95-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b1400945-53f0-41e1-826d-2bab62fe3a95" (UID: "b1400945-53f0-41e1-826d-2bab62fe3a95"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:50:56.579329 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:56.579219 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1400945-53f0-41e1-826d-2bab62fe3a95-kube-api-access-jxhr2" (OuterVolumeSpecName: "kube-api-access-jxhr2") pod "b1400945-53f0-41e1-826d-2bab62fe3a95" (UID: "b1400945-53f0-41e1-826d-2bab62fe3a95"). InnerVolumeSpecName "kube-api-access-jxhr2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:50:56.678203 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:56.678167 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b1400945-53f0-41e1-826d-2bab62fe3a95-service-ca\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:50:56.678203 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:56.678203 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1400945-53f0-41e1-826d-2bab62fe3a95-trusted-ca-bundle\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:50:56.678362 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:56.678218 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jxhr2\" (UniqueName: \"kubernetes.io/projected/b1400945-53f0-41e1-826d-2bab62fe3a95-kube-api-access-jxhr2\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" 
Apr 16 16:50:56.678362 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:56.678233 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b1400945-53f0-41e1-826d-2bab62fe3a95-console-oauth-config\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:50:56.678362 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:56.678244 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1400945-53f0-41e1-826d-2bab62fe3a95-console-serving-cert\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:50:57.249319 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:57.249292 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5798d64495-6g9hk_b1400945-53f0-41e1-826d-2bab62fe3a95/console/0.log" Apr 16 16:50:57.249486 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:57.249333 2572 generic.go:358] "Generic (PLEG): container finished" podID="b1400945-53f0-41e1-826d-2bab62fe3a95" containerID="28e1927c922c71da7f834e47e763b9b6305d23610b43439c7a03e88418406783" exitCode=2 Apr 16 16:50:57.249486 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:57.249406 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5798d64495-6g9hk" Apr 16 16:50:57.249486 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:57.249426 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5798d64495-6g9hk" event={"ID":"b1400945-53f0-41e1-826d-2bab62fe3a95","Type":"ContainerDied","Data":"28e1927c922c71da7f834e47e763b9b6305d23610b43439c7a03e88418406783"} Apr 16 16:50:57.249486 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:57.249468 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5798d64495-6g9hk" event={"ID":"b1400945-53f0-41e1-826d-2bab62fe3a95","Type":"ContainerDied","Data":"312776df46b7084e75eb094ebf573a5624d64c0f22b5bb83afa9fb8ac7a5d5f2"} Apr 16 16:50:57.249486 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:57.249483 2572 scope.go:117] "RemoveContainer" containerID="28e1927c922c71da7f834e47e763b9b6305d23610b43439c7a03e88418406783" Apr 16 16:50:57.256971 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:57.256800 2572 scope.go:117] "RemoveContainer" containerID="28e1927c922c71da7f834e47e763b9b6305d23610b43439c7a03e88418406783" Apr 16 16:50:57.257219 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:50:57.257200 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28e1927c922c71da7f834e47e763b9b6305d23610b43439c7a03e88418406783\": container with ID starting with 28e1927c922c71da7f834e47e763b9b6305d23610b43439c7a03e88418406783 not found: ID does not exist" containerID="28e1927c922c71da7f834e47e763b9b6305d23610b43439c7a03e88418406783" Apr 16 16:50:57.257276 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:57.257228 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28e1927c922c71da7f834e47e763b9b6305d23610b43439c7a03e88418406783"} err="failed to get container status \"28e1927c922c71da7f834e47e763b9b6305d23610b43439c7a03e88418406783\": rpc error: code = 
NotFound desc = could not find container \"28e1927c922c71da7f834e47e763b9b6305d23610b43439c7a03e88418406783\": container with ID starting with 28e1927c922c71da7f834e47e763b9b6305d23610b43439c7a03e88418406783 not found: ID does not exist" Apr 16 16:50:57.269802 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:57.269777 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5798d64495-6g9hk"] Apr 16 16:50:57.273033 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:57.273013 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5798d64495-6g9hk"] Apr 16 16:50:57.589415 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:50:57.589332 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1400945-53f0-41e1-826d-2bab62fe3a95" path="/var/lib/kubelet/pods/b1400945-53f0-41e1-826d-2bab62fe3a95/volumes" Apr 16 16:51:13.001260 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:13.001228 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmnbs6"] Apr 16 16:51:13.001688 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:13.001476 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b1400945-53f0-41e1-826d-2bab62fe3a95" containerName="console" Apr 16 16:51:13.001688 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:13.001485 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1400945-53f0-41e1-826d-2bab62fe3a95" containerName="console" Apr 16 16:51:13.003911 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:13.002029 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b1400945-53f0-41e1-826d-2bab62fe3a95" containerName="console" Apr 16 16:51:13.007750 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:13.007725 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmnbs6" Apr 16 16:51:13.010478 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:13.010460 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 16:51:13.011876 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:13.011850 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wj6nm\"" Apr 16 16:51:13.011983 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:13.011855 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 16:51:13.013633 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:13.013611 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmnbs6"] Apr 16 16:51:13.099982 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:13.099945 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-252jc\" (UniqueName: \"kubernetes.io/projected/b6659625-194e-49f9-b216-4256d70d3690-kube-api-access-252jc\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmnbs6\" (UID: \"b6659625-194e-49f9-b216-4256d70d3690\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmnbs6" Apr 16 16:51:13.100173 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:13.100047 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6659625-194e-49f9-b216-4256d70d3690-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmnbs6\" (UID: \"b6659625-194e-49f9-b216-4256d70d3690\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmnbs6" Apr 16 16:51:13.100173 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:13.100097 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6659625-194e-49f9-b216-4256d70d3690-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmnbs6\" (UID: \"b6659625-194e-49f9-b216-4256d70d3690\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmnbs6" Apr 16 16:51:13.201103 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:13.201049 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6659625-194e-49f9-b216-4256d70d3690-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmnbs6\" (UID: \"b6659625-194e-49f9-b216-4256d70d3690\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmnbs6" Apr 16 16:51:13.201103 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:13.201106 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6659625-194e-49f9-b216-4256d70d3690-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmnbs6\" (UID: \"b6659625-194e-49f9-b216-4256d70d3690\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmnbs6" Apr 16 16:51:13.201307 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:13.201129 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-252jc\" (UniqueName: \"kubernetes.io/projected/b6659625-194e-49f9-b216-4256d70d3690-kube-api-access-252jc\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmnbs6\" (UID: \"b6659625-194e-49f9-b216-4256d70d3690\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmnbs6" Apr 16 16:51:13.201487 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:13.201464 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6659625-194e-49f9-b216-4256d70d3690-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmnbs6\" (UID: \"b6659625-194e-49f9-b216-4256d70d3690\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmnbs6" Apr 16 16:51:13.201546 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:13.201495 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6659625-194e-49f9-b216-4256d70d3690-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmnbs6\" (UID: \"b6659625-194e-49f9-b216-4256d70d3690\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmnbs6" Apr 16 16:51:13.209979 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:13.209945 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-252jc\" (UniqueName: \"kubernetes.io/projected/b6659625-194e-49f9-b216-4256d70d3690-kube-api-access-252jc\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmnbs6\" (UID: \"b6659625-194e-49f9-b216-4256d70d3690\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmnbs6" Apr 16 16:51:13.316915 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:13.316819 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmnbs6" Apr 16 16:51:13.430760 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:13.430622 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmnbs6"] Apr 16 16:51:13.433344 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:51:13.433315 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6659625_194e_49f9_b216_4256d70d3690.slice/crio-283e01d85ca1e69e697b0aab82ddfaed5cbbd0891fd77392e49d20fb0cabb63d WatchSource:0}: Error finding container 283e01d85ca1e69e697b0aab82ddfaed5cbbd0891fd77392e49d20fb0cabb63d: Status 404 returned error can't find the container with id 283e01d85ca1e69e697b0aab82ddfaed5cbbd0891fd77392e49d20fb0cabb63d Apr 16 16:51:14.292977 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:14.292935 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmnbs6" event={"ID":"b6659625-194e-49f9-b216-4256d70d3690","Type":"ContainerStarted","Data":"283e01d85ca1e69e697b0aab82ddfaed5cbbd0891fd77392e49d20fb0cabb63d"} Apr 16 16:51:18.305006 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:18.304919 2572 generic.go:358] "Generic (PLEG): container finished" podID="b6659625-194e-49f9-b216-4256d70d3690" containerID="64d831da4fc59c4cf80196af36c261089ecd7fa56267c58504a3be2bd22aa280" exitCode=0 Apr 16 16:51:18.305006 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:18.304993 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmnbs6" event={"ID":"b6659625-194e-49f9-b216-4256d70d3690","Type":"ContainerDied","Data":"64d831da4fc59c4cf80196af36c261089ecd7fa56267c58504a3be2bd22aa280"} Apr 16 16:51:21.314804 ip-10-0-137-126 kubenswrapper[2572]: 
I0416 16:51:21.314730 2572 generic.go:358] "Generic (PLEG): container finished" podID="b6659625-194e-49f9-b216-4256d70d3690" containerID="f555c82665618053d6c3f3f2724acef0ee9b85ef6666953bc440b4238999994a" exitCode=0 Apr 16 16:51:21.314804 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:21.314770 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmnbs6" event={"ID":"b6659625-194e-49f9-b216-4256d70d3690","Type":"ContainerDied","Data":"f555c82665618053d6c3f3f2724acef0ee9b85ef6666953bc440b4238999994a"} Apr 16 16:51:29.336973 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:29.336930 2572 generic.go:358] "Generic (PLEG): container finished" podID="b6659625-194e-49f9-b216-4256d70d3690" containerID="f371c751416f87dd75ca6484d60cbf5467a8136e73c9ceb1e8b5f503902b686c" exitCode=0 Apr 16 16:51:29.337340 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:29.337007 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmnbs6" event={"ID":"b6659625-194e-49f9-b216-4256d70d3690","Type":"ContainerDied","Data":"f371c751416f87dd75ca6484d60cbf5467a8136e73c9ceb1e8b5f503902b686c"} Apr 16 16:51:30.456508 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:30.456484 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmnbs6"
Apr 16 16:51:30.538603 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:30.538576 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-252jc\" (UniqueName: \"kubernetes.io/projected/b6659625-194e-49f9-b216-4256d70d3690-kube-api-access-252jc\") pod \"b6659625-194e-49f9-b216-4256d70d3690\" (UID: \"b6659625-194e-49f9-b216-4256d70d3690\") "
Apr 16 16:51:30.538708 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:30.538606 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6659625-194e-49f9-b216-4256d70d3690-util\") pod \"b6659625-194e-49f9-b216-4256d70d3690\" (UID: \"b6659625-194e-49f9-b216-4256d70d3690\") "
Apr 16 16:51:30.538708 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:30.538633 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6659625-194e-49f9-b216-4256d70d3690-bundle\") pod \"b6659625-194e-49f9-b216-4256d70d3690\" (UID: \"b6659625-194e-49f9-b216-4256d70d3690\") "
Apr 16 16:51:30.539193 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:30.539165 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6659625-194e-49f9-b216-4256d70d3690-bundle" (OuterVolumeSpecName: "bundle") pod "b6659625-194e-49f9-b216-4256d70d3690" (UID: "b6659625-194e-49f9-b216-4256d70d3690"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:51:30.540623 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:30.540602 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6659625-194e-49f9-b216-4256d70d3690-kube-api-access-252jc" (OuterVolumeSpecName: "kube-api-access-252jc") pod "b6659625-194e-49f9-b216-4256d70d3690" (UID: "b6659625-194e-49f9-b216-4256d70d3690"). InnerVolumeSpecName "kube-api-access-252jc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:51:30.543128 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:30.543099 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6659625-194e-49f9-b216-4256d70d3690-util" (OuterVolumeSpecName: "util") pod "b6659625-194e-49f9-b216-4256d70d3690" (UID: "b6659625-194e-49f9-b216-4256d70d3690"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:51:30.639336 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:30.639315 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-252jc\" (UniqueName: \"kubernetes.io/projected/b6659625-194e-49f9-b216-4256d70d3690-kube-api-access-252jc\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 16:51:30.639414 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:30.639337 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6659625-194e-49f9-b216-4256d70d3690-util\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 16:51:30.639414 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:30.639349 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6659625-194e-49f9-b216-4256d70d3690-bundle\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 16:51:31.343781 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:31.343741 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmnbs6" event={"ID":"b6659625-194e-49f9-b216-4256d70d3690","Type":"ContainerDied","Data":"283e01d85ca1e69e697b0aab82ddfaed5cbbd0891fd77392e49d20fb0cabb63d"}
Apr 16 16:51:31.343781 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:31.343778 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="283e01d85ca1e69e697b0aab82ddfaed5cbbd0891fd77392e49d20fb0cabb63d"
Apr 16 16:51:31.344003 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:31.343804 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmnbs6"
Apr 16 16:51:34.623374 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:34.623346 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lfvxf"]
Apr 16 16:51:34.623765 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:34.623610 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b6659625-194e-49f9-b216-4256d70d3690" containerName="util"
Apr 16 16:51:34.623765 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:34.623621 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6659625-194e-49f9-b216-4256d70d3690" containerName="util"
Apr 16 16:51:34.623765 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:34.623628 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b6659625-194e-49f9-b216-4256d70d3690" containerName="pull"
Apr 16 16:51:34.623765 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:34.623633 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6659625-194e-49f9-b216-4256d70d3690" containerName="pull"
Apr 16 16:51:34.623765 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:34.623645 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b6659625-194e-49f9-b216-4256d70d3690" containerName="extract"
Apr 16 16:51:34.623765 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:34.623654 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6659625-194e-49f9-b216-4256d70d3690" containerName="extract"
Apr 16 16:51:34.623765 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:34.623700 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b6659625-194e-49f9-b216-4256d70d3690" containerName="extract"
Apr 16 16:51:34.650049 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:34.650025 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lfvxf"]
Apr 16 16:51:34.650195 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:34.650142 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lfvxf"
Apr 16 16:51:34.652791 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:34.652771 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 16 16:51:34.652923 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:34.652788 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-jg5lg\""
Apr 16 16:51:34.652923 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:34.652905 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 16 16:51:34.653037 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:34.653024 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 16 16:51:34.770484 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:34.770460 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk9p8\" (UniqueName: \"kubernetes.io/projected/382c6f8e-06d8-45a0-a719-1087ab9029bf-kube-api-access-gk9p8\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-lfvxf\" (UID: \"382c6f8e-06d8-45a0-a719-1087ab9029bf\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lfvxf"
Apr 16 16:51:34.770587 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:34.770524 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/382c6f8e-06d8-45a0-a719-1087ab9029bf-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-lfvxf\" (UID: \"382c6f8e-06d8-45a0-a719-1087ab9029bf\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lfvxf"
Apr 16 16:51:34.871473 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:34.871448 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/382c6f8e-06d8-45a0-a719-1087ab9029bf-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-lfvxf\" (UID: \"382c6f8e-06d8-45a0-a719-1087ab9029bf\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lfvxf"
Apr 16 16:51:34.871616 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:34.871488 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gk9p8\" (UniqueName: \"kubernetes.io/projected/382c6f8e-06d8-45a0-a719-1087ab9029bf-kube-api-access-gk9p8\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-lfvxf\" (UID: \"382c6f8e-06d8-45a0-a719-1087ab9029bf\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lfvxf"
Apr 16 16:51:34.873862 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:34.873816 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/382c6f8e-06d8-45a0-a719-1087ab9029bf-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-lfvxf\" (UID: \"382c6f8e-06d8-45a0-a719-1087ab9029bf\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lfvxf"
Apr 16 16:51:34.886924 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:34.886899 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk9p8\" (UniqueName: \"kubernetes.io/projected/382c6f8e-06d8-45a0-a719-1087ab9029bf-kube-api-access-gk9p8\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-lfvxf\" (UID: \"382c6f8e-06d8-45a0-a719-1087ab9029bf\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lfvxf"
Apr 16 16:51:34.964234 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:34.964214 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lfvxf"
Apr 16 16:51:35.081917 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:35.081803 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lfvxf"]
Apr 16 16:51:35.084723 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:51:35.084698 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod382c6f8e_06d8_45a0_a719_1087ab9029bf.slice/crio-cfbea020c0a87cf5638da88dd02e4b30d2b71c2c98d53837ccbcf67854750d0e WatchSource:0}: Error finding container cfbea020c0a87cf5638da88dd02e4b30d2b71c2c98d53837ccbcf67854750d0e: Status 404 returned error can't find the container with id cfbea020c0a87cf5638da88dd02e4b30d2b71c2c98d53837ccbcf67854750d0e
Apr 16 16:51:35.354244 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:35.354178 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lfvxf" event={"ID":"382c6f8e-06d8-45a0-a719-1087ab9029bf","Type":"ContainerStarted","Data":"cfbea020c0a87cf5638da88dd02e4b30d2b71c2c98d53837ccbcf67854750d0e"}
Apr 16 16:51:39.366383 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:39.366304 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lfvxf" event={"ID":"382c6f8e-06d8-45a0-a719-1087ab9029bf","Type":"ContainerStarted","Data":"58679dd2b945bbe5fdbca989f18022dac91dcf7b3ef88eaa69f42cff472cfdb4"}
Apr 16 16:51:39.366710 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:39.366427 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lfvxf"
Apr 16 16:51:39.404623 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:39.404583 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lfvxf" podStartSLOduration=1.581693831 podStartE2EDuration="5.404570887s" podCreationTimestamp="2026-04-16 16:51:34 +0000 UTC" firstStartedPulling="2026-04-16 16:51:35.086516681 +0000 UTC m=+212.255228023" lastFinishedPulling="2026-04-16 16:51:38.909393734 +0000 UTC m=+216.078105079" observedRunningTime="2026-04-16 16:51:39.403274562 +0000 UTC m=+216.571985925" watchObservedRunningTime="2026-04-16 16:51:39.404570887 +0000 UTC m=+216.573282249"
Apr 16 16:51:39.481592 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:39.481558 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-t6tng"]
Apr 16 16:51:39.484853 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:39.484829 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-t6tng"
Apr 16 16:51:39.487611 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:39.487591 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 16 16:51:39.487704 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:39.487630 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-jkkld\""
Apr 16 16:51:39.487988 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:39.487969 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 16 16:51:39.493747 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:39.493729 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-t6tng"]
Apr 16 16:51:39.607293 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:39.607266 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh848\" (UniqueName: \"kubernetes.io/projected/f1b2d401-79df-4052-8215-223ea1c2a0a5-kube-api-access-sh848\") pod \"keda-operator-ffbb595cb-t6tng\" (UID: \"f1b2d401-79df-4052-8215-223ea1c2a0a5\") " pod="openshift-keda/keda-operator-ffbb595cb-t6tng"
Apr 16 16:51:39.607431 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:39.607303 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f1b2d401-79df-4052-8215-223ea1c2a0a5-certificates\") pod \"keda-operator-ffbb595cb-t6tng\" (UID: \"f1b2d401-79df-4052-8215-223ea1c2a0a5\") " pod="openshift-keda/keda-operator-ffbb595cb-t6tng"
Apr 16 16:51:39.607431 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:39.607371 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/f1b2d401-79df-4052-8215-223ea1c2a0a5-cabundle0\") pod \"keda-operator-ffbb595cb-t6tng\" (UID: \"f1b2d401-79df-4052-8215-223ea1c2a0a5\") " pod="openshift-keda/keda-operator-ffbb595cb-t6tng"
Apr 16 16:51:39.708694 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:39.708661 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f1b2d401-79df-4052-8215-223ea1c2a0a5-certificates\") pod \"keda-operator-ffbb595cb-t6tng\" (UID: \"f1b2d401-79df-4052-8215-223ea1c2a0a5\") " pod="openshift-keda/keda-operator-ffbb595cb-t6tng"
Apr 16 16:51:39.708835 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:39.708716 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/f1b2d401-79df-4052-8215-223ea1c2a0a5-cabundle0\") pod \"keda-operator-ffbb595cb-t6tng\" (UID: \"f1b2d401-79df-4052-8215-223ea1c2a0a5\") " pod="openshift-keda/keda-operator-ffbb595cb-t6tng"
Apr 16 16:51:39.708906 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:51:39.708887 2572 secret.go:281] references non-existent secret key: ca.crt
Apr 16 16:51:39.708940 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:51:39.708912 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 16:51:39.708940 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:51:39.708925 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-t6tng: references non-existent secret key: ca.crt
Apr 16 16:51:39.708998 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:51:39.708986 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f1b2d401-79df-4052-8215-223ea1c2a0a5-certificates podName:f1b2d401-79df-4052-8215-223ea1c2a0a5 nodeName:}" failed. No retries permitted until 2026-04-16 16:51:40.208966317 +0000 UTC m=+217.377677662 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/f1b2d401-79df-4052-8215-223ea1c2a0a5-certificates") pod "keda-operator-ffbb595cb-t6tng" (UID: "f1b2d401-79df-4052-8215-223ea1c2a0a5") : references non-existent secret key: ca.crt
Apr 16 16:51:39.709127 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:39.709097 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sh848\" (UniqueName: \"kubernetes.io/projected/f1b2d401-79df-4052-8215-223ea1c2a0a5-kube-api-access-sh848\") pod \"keda-operator-ffbb595cb-t6tng\" (UID: \"f1b2d401-79df-4052-8215-223ea1c2a0a5\") " pod="openshift-keda/keda-operator-ffbb595cb-t6tng"
Apr 16 16:51:39.709486 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:39.709466 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/f1b2d401-79df-4052-8215-223ea1c2a0a5-cabundle0\") pod \"keda-operator-ffbb595cb-t6tng\" (UID: \"f1b2d401-79df-4052-8215-223ea1c2a0a5\") " pod="openshift-keda/keda-operator-ffbb595cb-t6tng"
Apr 16 16:51:39.717555 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:39.717535 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh848\" (UniqueName: \"kubernetes.io/projected/f1b2d401-79df-4052-8215-223ea1c2a0a5-kube-api-access-sh848\") pod \"keda-operator-ffbb595cb-t6tng\" (UID: \"f1b2d401-79df-4052-8215-223ea1c2a0a5\") " pod="openshift-keda/keda-operator-ffbb595cb-t6tng"
Apr 16 16:51:39.839165 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:39.839129 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-6d642"]
Apr 16 16:51:39.842468 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:39.842452 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6d642"
Apr 16 16:51:39.844946 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:39.844922 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 16 16:51:39.849836 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:39.849809 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-6d642"]
Apr 16 16:51:39.910833 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:39.910799 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/1abd6ff1-83b8-476b-986b-6cf27b7b7da0-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-6d642\" (UID: \"1abd6ff1-83b8-476b-986b-6cf27b7b7da0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6d642"
Apr 16 16:51:39.910985 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:39.910844 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkx4x\" (UniqueName: \"kubernetes.io/projected/1abd6ff1-83b8-476b-986b-6cf27b7b7da0-kube-api-access-wkx4x\") pod \"keda-metrics-apiserver-7c9f485588-6d642\" (UID: \"1abd6ff1-83b8-476b-986b-6cf27b7b7da0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6d642"
Apr 16 16:51:39.910985 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:39.910948 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1abd6ff1-83b8-476b-986b-6cf27b7b7da0-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6d642\" (UID: \"1abd6ff1-83b8-476b-986b-6cf27b7b7da0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6d642"
Apr 16 16:51:40.011727 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:40.011655 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1abd6ff1-83b8-476b-986b-6cf27b7b7da0-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6d642\" (UID: \"1abd6ff1-83b8-476b-986b-6cf27b7b7da0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6d642"
Apr 16 16:51:40.011727 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:40.011724 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/1abd6ff1-83b8-476b-986b-6cf27b7b7da0-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-6d642\" (UID: \"1abd6ff1-83b8-476b-986b-6cf27b7b7da0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6d642"
Apr 16 16:51:40.011949 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:40.011746 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkx4x\" (UniqueName: \"kubernetes.io/projected/1abd6ff1-83b8-476b-986b-6cf27b7b7da0-kube-api-access-wkx4x\") pod \"keda-metrics-apiserver-7c9f485588-6d642\" (UID: \"1abd6ff1-83b8-476b-986b-6cf27b7b7da0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6d642"
Apr 16 16:51:40.011949 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:51:40.011814 2572 secret.go:281] references non-existent secret key: tls.crt
Apr 16 16:51:40.011949 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:51:40.011836 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 16:51:40.011949 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:51:40.011856 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-6d642: references non-existent secret key: tls.crt
Apr 16 16:51:40.011949 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:51:40.011918 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1abd6ff1-83b8-476b-986b-6cf27b7b7da0-certificates podName:1abd6ff1-83b8-476b-986b-6cf27b7b7da0 nodeName:}" failed. No retries permitted until 2026-04-16 16:51:40.511897213 +0000 UTC m=+217.680608573 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/1abd6ff1-83b8-476b-986b-6cf27b7b7da0-certificates") pod "keda-metrics-apiserver-7c9f485588-6d642" (UID: "1abd6ff1-83b8-476b-986b-6cf27b7b7da0") : references non-existent secret key: tls.crt
Apr 16 16:51:40.012201 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:40.012093 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/1abd6ff1-83b8-476b-986b-6cf27b7b7da0-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-6d642\" (UID: \"1abd6ff1-83b8-476b-986b-6cf27b7b7da0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6d642"
Apr 16 16:51:40.026470 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:40.026440 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkx4x\" (UniqueName: \"kubernetes.io/projected/1abd6ff1-83b8-476b-986b-6cf27b7b7da0-kube-api-access-wkx4x\") pod \"keda-metrics-apiserver-7c9f485588-6d642\" (UID: \"1abd6ff1-83b8-476b-986b-6cf27b7b7da0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6d642"
Apr 16 16:51:40.037022 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:40.036993 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-jsp5q"]
Apr 16 16:51:40.040191 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:40.040177 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-jsp5q"
Apr 16 16:51:40.043167 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:40.043149 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 16 16:51:40.054082 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:40.054044 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-jsp5q"]
Apr 16 16:51:40.112570 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:40.112543 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/26d265df-f81b-49fe-a2d6-f75b7e93b923-certificates\") pod \"keda-admission-cf49989db-jsp5q\" (UID: \"26d265df-f81b-49fe-a2d6-f75b7e93b923\") " pod="openshift-keda/keda-admission-cf49989db-jsp5q"
Apr 16 16:51:40.112712 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:40.112610 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9rq2\" (UniqueName: \"kubernetes.io/projected/26d265df-f81b-49fe-a2d6-f75b7e93b923-kube-api-access-g9rq2\") pod \"keda-admission-cf49989db-jsp5q\" (UID: \"26d265df-f81b-49fe-a2d6-f75b7e93b923\") " pod="openshift-keda/keda-admission-cf49989db-jsp5q"
Apr 16 16:51:40.213372 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:40.213330 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9rq2\" (UniqueName: \"kubernetes.io/projected/26d265df-f81b-49fe-a2d6-f75b7e93b923-kube-api-access-g9rq2\") pod \"keda-admission-cf49989db-jsp5q\" (UID: \"26d265df-f81b-49fe-a2d6-f75b7e93b923\") " pod="openshift-keda/keda-admission-cf49989db-jsp5q"
Apr 16 16:51:40.213561 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:40.213393 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f1b2d401-79df-4052-8215-223ea1c2a0a5-certificates\") pod \"keda-operator-ffbb595cb-t6tng\" (UID: \"f1b2d401-79df-4052-8215-223ea1c2a0a5\") " pod="openshift-keda/keda-operator-ffbb595cb-t6tng"
Apr 16 16:51:40.213561 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:40.213450 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/26d265df-f81b-49fe-a2d6-f75b7e93b923-certificates\") pod \"keda-admission-cf49989db-jsp5q\" (UID: \"26d265df-f81b-49fe-a2d6-f75b7e93b923\") " pod="openshift-keda/keda-admission-cf49989db-jsp5q"
Apr 16 16:51:40.213561 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:51:40.213536 2572 secret.go:281] references non-existent secret key: ca.crt
Apr 16 16:51:40.213561 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:51:40.213557 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 16:51:40.213561 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:51:40.213562 2572 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found
Apr 16 16:51:40.213778 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:51:40.213569 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-t6tng: references non-existent secret key: ca.crt
Apr 16 16:51:40.213778 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:51:40.213581 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-jsp5q: secret "keda-admission-webhooks-certs" not found
Apr 16 16:51:40.213778 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:51:40.213630 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26d265df-f81b-49fe-a2d6-f75b7e93b923-certificates podName:26d265df-f81b-49fe-a2d6-f75b7e93b923 nodeName:}" failed. No retries permitted until 2026-04-16 16:51:40.713612773 +0000 UTC m=+217.882324118 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/26d265df-f81b-49fe-a2d6-f75b7e93b923-certificates") pod "keda-admission-cf49989db-jsp5q" (UID: "26d265df-f81b-49fe-a2d6-f75b7e93b923") : secret "keda-admission-webhooks-certs" not found
Apr 16 16:51:40.213778 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:51:40.213646 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f1b2d401-79df-4052-8215-223ea1c2a0a5-certificates podName:f1b2d401-79df-4052-8215-223ea1c2a0a5 nodeName:}" failed. No retries permitted until 2026-04-16 16:51:41.213637418 +0000 UTC m=+218.382348768 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/f1b2d401-79df-4052-8215-223ea1c2a0a5-certificates") pod "keda-operator-ffbb595cb-t6tng" (UID: "f1b2d401-79df-4052-8215-223ea1c2a0a5") : references non-existent secret key: ca.crt
Apr 16 16:51:40.222352 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:40.222328 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9rq2\" (UniqueName: \"kubernetes.io/projected/26d265df-f81b-49fe-a2d6-f75b7e93b923-kube-api-access-g9rq2\") pod \"keda-admission-cf49989db-jsp5q\" (UID: \"26d265df-f81b-49fe-a2d6-f75b7e93b923\") " pod="openshift-keda/keda-admission-cf49989db-jsp5q"
Apr 16 16:51:40.515983 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:40.515950 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1abd6ff1-83b8-476b-986b-6cf27b7b7da0-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6d642\" (UID: \"1abd6ff1-83b8-476b-986b-6cf27b7b7da0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6d642"
Apr 16 16:51:40.516379 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:51:40.516125 2572 secret.go:281] references non-existent secret key: tls.crt
Apr 16 16:51:40.516379 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:51:40.516146 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 16:51:40.516379 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:51:40.516166 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-6d642: references non-existent secret key: tls.crt
Apr 16 16:51:40.516379 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:51:40.516233 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1abd6ff1-83b8-476b-986b-6cf27b7b7da0-certificates podName:1abd6ff1-83b8-476b-986b-6cf27b7b7da0 nodeName:}" failed. No retries permitted until 2026-04-16 16:51:41.51621269 +0000 UTC m=+218.684924057 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/1abd6ff1-83b8-476b-986b-6cf27b7b7da0-certificates") pod "keda-metrics-apiserver-7c9f485588-6d642" (UID: "1abd6ff1-83b8-476b-986b-6cf27b7b7da0") : references non-existent secret key: tls.crt
Apr 16 16:51:40.717469 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:40.717438 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/26d265df-f81b-49fe-a2d6-f75b7e93b923-certificates\") pod \"keda-admission-cf49989db-jsp5q\" (UID: \"26d265df-f81b-49fe-a2d6-f75b7e93b923\") " pod="openshift-keda/keda-admission-cf49989db-jsp5q"
Apr 16 16:51:40.717625 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:51:40.717584 2572 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found
Apr 16 16:51:40.717625 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:51:40.717608 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-jsp5q: secret "keda-admission-webhooks-certs" not found
Apr 16 16:51:40.717699 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:51:40.717665 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26d265df-f81b-49fe-a2d6-f75b7e93b923-certificates podName:26d265df-f81b-49fe-a2d6-f75b7e93b923 nodeName:}" failed. No retries permitted until 2026-04-16 16:51:41.717645144 +0000 UTC m=+218.886356488 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/26d265df-f81b-49fe-a2d6-f75b7e93b923-certificates") pod "keda-admission-cf49989db-jsp5q" (UID: "26d265df-f81b-49fe-a2d6-f75b7e93b923") : secret "keda-admission-webhooks-certs" not found
Apr 16 16:51:41.220657 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:41.220619 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f1b2d401-79df-4052-8215-223ea1c2a0a5-certificates\") pod \"keda-operator-ffbb595cb-t6tng\" (UID: \"f1b2d401-79df-4052-8215-223ea1c2a0a5\") " pod="openshift-keda/keda-operator-ffbb595cb-t6tng"
Apr 16 16:51:41.223019 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:41.223000 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f1b2d401-79df-4052-8215-223ea1c2a0a5-certificates\") pod \"keda-operator-ffbb595cb-t6tng\" (UID: \"f1b2d401-79df-4052-8215-223ea1c2a0a5\") " pod="openshift-keda/keda-operator-ffbb595cb-t6tng"
Apr 16 16:51:41.295160 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:41.295132 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-t6tng"
Apr 16 16:51:41.410330 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:41.410300 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-t6tng"]
Apr 16 16:51:41.413125 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:51:41.413093 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1b2d401_79df_4052_8215_223ea1c2a0a5.slice/crio-e342de5212024fc1828335e5da944b9ed7375fc616c32dd9588abf7b8800d410 WatchSource:0}: Error finding container e342de5212024fc1828335e5da944b9ed7375fc616c32dd9588abf7b8800d410: Status 404 returned error can't find the container with id e342de5212024fc1828335e5da944b9ed7375fc616c32dd9588abf7b8800d410
Apr 16 16:51:41.522957 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:41.522877 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1abd6ff1-83b8-476b-986b-6cf27b7b7da0-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6d642\" (UID: \"1abd6ff1-83b8-476b-986b-6cf27b7b7da0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6d642"
Apr 16 16:51:41.525214 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:41.525196 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1abd6ff1-83b8-476b-986b-6cf27b7b7da0-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6d642\" (UID: \"1abd6ff1-83b8-476b-986b-6cf27b7b7da0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6d642"
Apr 16 16:51:41.653132 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:41.653105 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6d642"
Apr 16 16:51:41.724601 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:41.724568 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/26d265df-f81b-49fe-a2d6-f75b7e93b923-certificates\") pod \"keda-admission-cf49989db-jsp5q\" (UID: \"26d265df-f81b-49fe-a2d6-f75b7e93b923\") " pod="openshift-keda/keda-admission-cf49989db-jsp5q"
Apr 16 16:51:41.727020 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:41.726991 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/26d265df-f81b-49fe-a2d6-f75b7e93b923-certificates\") pod \"keda-admission-cf49989db-jsp5q\" (UID: \"26d265df-f81b-49fe-a2d6-f75b7e93b923\") " pod="openshift-keda/keda-admission-cf49989db-jsp5q"
Apr 16 16:51:41.764598 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:41.764543 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-6d642"]
Apr 16 16:51:41.766958 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:51:41.766934 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1abd6ff1_83b8_476b_986b_6cf27b7b7da0.slice/crio-9141ef36a9da4dc67d5a1314cfe96a05cf228972e36bdc2f87d118b5cf94115c WatchSource:0}: Error finding container 9141ef36a9da4dc67d5a1314cfe96a05cf228972e36bdc2f87d118b5cf94115c: Status 404 returned error can't find the container with id 9141ef36a9da4dc67d5a1314cfe96a05cf228972e36bdc2f87d118b5cf94115c
Apr 16 16:51:41.849913 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:41.849843 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-jsp5q"
Apr 16 16:51:41.963120 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:41.963094 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-jsp5q"]
Apr 16 16:51:41.965234 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:51:41.965209 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26d265df_f81b_49fe_a2d6_f75b7e93b923.slice/crio-e7f131ea1c22c663e83ef60c19535021cb0a1960b56207ce7cba380e1dc225e5 WatchSource:0}: Error finding container e7f131ea1c22c663e83ef60c19535021cb0a1960b56207ce7cba380e1dc225e5: Status 404 returned error can't find the container with id e7f131ea1c22c663e83ef60c19535021cb0a1960b56207ce7cba380e1dc225e5
Apr 16 16:51:42.378531 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:42.378493 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6d642" event={"ID":"1abd6ff1-83b8-476b-986b-6cf27b7b7da0","Type":"ContainerStarted","Data":"9141ef36a9da4dc67d5a1314cfe96a05cf228972e36bdc2f87d118b5cf94115c"}
Apr 16 16:51:42.379429 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:42.379408 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-jsp5q" event={"ID":"26d265df-f81b-49fe-a2d6-f75b7e93b923","Type":"ContainerStarted","Data":"e7f131ea1c22c663e83ef60c19535021cb0a1960b56207ce7cba380e1dc225e5"}
Apr 16 16:51:42.380297 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:42.380278 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-t6tng" event={"ID":"f1b2d401-79df-4052-8215-223ea1c2a0a5","Type":"ContainerStarted","Data":"e342de5212024fc1828335e5da944b9ed7375fc616c32dd9588abf7b8800d410"}
Apr 16 16:51:44.390227 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:44.390173 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="openshift-keda/keda-admission-cf49989db-jsp5q" event={"ID":"26d265df-f81b-49fe-a2d6-f75b7e93b923","Type":"ContainerStarted","Data":"ce9b91b93aa8cfdc7f0b98a93228aa59f0dccc2a63538292bfb1ae1aaa297126"} Apr 16 16:51:44.390681 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:44.390435 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-jsp5q" Apr 16 16:51:44.407312 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:44.407177 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-jsp5q" podStartSLOduration=2.899737697 podStartE2EDuration="4.407160473s" podCreationTimestamp="2026-04-16 16:51:40 +0000 UTC" firstStartedPulling="2026-04-16 16:51:41.966569495 +0000 UTC m=+219.135280837" lastFinishedPulling="2026-04-16 16:51:43.473992254 +0000 UTC m=+220.642703613" observedRunningTime="2026-04-16 16:51:44.406305129 +0000 UTC m=+221.575016493" watchObservedRunningTime="2026-04-16 16:51:44.407160473 +0000 UTC m=+221.575871838" Apr 16 16:51:45.394829 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:45.394785 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6d642" event={"ID":"1abd6ff1-83b8-476b-986b-6cf27b7b7da0","Type":"ContainerStarted","Data":"8ded3e8e78e3104ed4beddc7d5ab4250cc7499bc07308b21579a299e336112f6"} Apr 16 16:51:45.395268 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:45.394981 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6d642" Apr 16 16:51:45.412196 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:45.412147 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6d642" podStartSLOduration=3.279241471 podStartE2EDuration="6.412132346s" podCreationTimestamp="2026-04-16 16:51:39 +0000 UTC" 
firstStartedPulling="2026-04-16 16:51:41.768290674 +0000 UTC m=+218.937002015" lastFinishedPulling="2026-04-16 16:51:44.901181522 +0000 UTC m=+222.069892890" observedRunningTime="2026-04-16 16:51:45.410278558 +0000 UTC m=+222.578989921" watchObservedRunningTime="2026-04-16 16:51:45.412132346 +0000 UTC m=+222.580843704" Apr 16 16:51:49.407439 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:49.407402 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-t6tng" event={"ID":"f1b2d401-79df-4052-8215-223ea1c2a0a5","Type":"ContainerStarted","Data":"3eccae39a2bbb7dfa5ee042a75be925c3a1d81d7288f992cae8c6a5821c19405"} Apr 16 16:51:49.407869 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:49.407651 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-t6tng" Apr 16 16:51:49.427959 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:49.427908 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-t6tng" podStartSLOduration=3.233458695 podStartE2EDuration="10.427893553s" podCreationTimestamp="2026-04-16 16:51:39 +0000 UTC" firstStartedPulling="2026-04-16 16:51:41.41431024 +0000 UTC m=+218.583021581" lastFinishedPulling="2026-04-16 16:51:48.608745084 +0000 UTC m=+225.777456439" observedRunningTime="2026-04-16 16:51:49.427503894 +0000 UTC m=+226.596215272" watchObservedRunningTime="2026-04-16 16:51:49.427893553 +0000 UTC m=+226.596604918" Apr 16 16:51:56.403266 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:51:56.403235 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6d642" Apr 16 16:52:00.371624 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:00.371599 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lfvxf" Apr 16 16:52:05.397665 
ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:05.397637 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-jsp5q" Apr 16 16:52:10.413214 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:10.413185 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-t6tng" Apr 16 16:52:33.113330 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:33.113258 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jx78q"] Apr 16 16:52:33.119117 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:33.119095 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jx78q" Apr 16 16:52:33.122126 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:33.122107 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 16:52:33.122248 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:33.122131 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 16:52:33.122248 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:33.122107 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wj6nm\"" Apr 16 16:52:33.125572 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:33.125549 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jx78q"] Apr 16 16:52:33.168115 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:33.168088 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwdvc\" (UniqueName: 
\"kubernetes.io/projected/4f3e27dd-be51-4786-9fde-2d6f337ed698-kube-api-access-jwdvc\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jx78q\" (UID: \"4f3e27dd-be51-4786-9fde-2d6f337ed698\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jx78q" Apr 16 16:52:33.168232 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:33.168129 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f3e27dd-be51-4786-9fde-2d6f337ed698-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jx78q\" (UID: \"4f3e27dd-be51-4786-9fde-2d6f337ed698\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jx78q" Apr 16 16:52:33.168232 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:33.168184 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f3e27dd-be51-4786-9fde-2d6f337ed698-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jx78q\" (UID: \"4f3e27dd-be51-4786-9fde-2d6f337ed698\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jx78q" Apr 16 16:52:33.269366 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:33.269345 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f3e27dd-be51-4786-9fde-2d6f337ed698-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jx78q\" (UID: \"4f3e27dd-be51-4786-9fde-2d6f337ed698\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jx78q" Apr 16 16:52:33.269472 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:33.269372 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/4f3e27dd-be51-4786-9fde-2d6f337ed698-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jx78q\" (UID: \"4f3e27dd-be51-4786-9fde-2d6f337ed698\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jx78q" Apr 16 16:52:33.269605 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:33.269586 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwdvc\" (UniqueName: \"kubernetes.io/projected/4f3e27dd-be51-4786-9fde-2d6f337ed698-kube-api-access-jwdvc\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jx78q\" (UID: \"4f3e27dd-be51-4786-9fde-2d6f337ed698\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jx78q" Apr 16 16:52:33.269713 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:33.269697 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f3e27dd-be51-4786-9fde-2d6f337ed698-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jx78q\" (UID: \"4f3e27dd-be51-4786-9fde-2d6f337ed698\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jx78q" Apr 16 16:52:33.269775 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:33.269719 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f3e27dd-be51-4786-9fde-2d6f337ed698-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jx78q\" (UID: \"4f3e27dd-be51-4786-9fde-2d6f337ed698\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jx78q" Apr 16 16:52:33.280153 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:33.280130 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwdvc\" (UniqueName: \"kubernetes.io/projected/4f3e27dd-be51-4786-9fde-2d6f337ed698-kube-api-access-jwdvc\") 
pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jx78q\" (UID: \"4f3e27dd-be51-4786-9fde-2d6f337ed698\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jx78q" Apr 16 16:52:33.428915 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:33.428889 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jx78q" Apr 16 16:52:33.544017 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:33.543978 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jx78q"] Apr 16 16:52:33.546035 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:52:33.546002 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f3e27dd_be51_4786_9fde_2d6f337ed698.slice/crio-83816f2ff897652f3682786f8df88e202990cfce0c94296c5fefba485230a1df WatchSource:0}: Error finding container 83816f2ff897652f3682786f8df88e202990cfce0c94296c5fefba485230a1df: Status 404 returned error can't find the container with id 83816f2ff897652f3682786f8df88e202990cfce0c94296c5fefba485230a1df Apr 16 16:52:34.545333 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:34.545264 2572 generic.go:358] "Generic (PLEG): container finished" podID="4f3e27dd-be51-4786-9fde-2d6f337ed698" containerID="aee418f820c0e882f511feabef8b82c6ddde34ac3a48603641f07c05515757cb" exitCode=0 Apr 16 16:52:34.545637 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:34.545347 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jx78q" event={"ID":"4f3e27dd-be51-4786-9fde-2d6f337ed698","Type":"ContainerDied","Data":"aee418f820c0e882f511feabef8b82c6ddde34ac3a48603641f07c05515757cb"} Apr 16 16:52:34.545637 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:34.545380 2572 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jx78q" event={"ID":"4f3e27dd-be51-4786-9fde-2d6f337ed698","Type":"ContainerStarted","Data":"83816f2ff897652f3682786f8df88e202990cfce0c94296c5fefba485230a1df"} Apr 16 16:52:35.549438 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:35.549355 2572 generic.go:358] "Generic (PLEG): container finished" podID="4f3e27dd-be51-4786-9fde-2d6f337ed698" containerID="7fcc0581f4cda282f2f6e68bca3b5adab79d49039f49daafe6221c71e5cdabf5" exitCode=0 Apr 16 16:52:35.549783 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:35.549441 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jx78q" event={"ID":"4f3e27dd-be51-4786-9fde-2d6f337ed698","Type":"ContainerDied","Data":"7fcc0581f4cda282f2f6e68bca3b5adab79d49039f49daafe6221c71e5cdabf5"} Apr 16 16:52:36.554705 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:36.554670 2572 generic.go:358] "Generic (PLEG): container finished" podID="4f3e27dd-be51-4786-9fde-2d6f337ed698" containerID="0ad7915653ee4a670349bc5e09d1ce150e5f91427625071569345799b04af115" exitCode=0 Apr 16 16:52:36.555188 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:36.554764 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jx78q" event={"ID":"4f3e27dd-be51-4786-9fde-2d6f337ed698","Type":"ContainerDied","Data":"0ad7915653ee4a670349bc5e09d1ce150e5f91427625071569345799b04af115"} Apr 16 16:52:37.670980 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:37.670961 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jx78q" Apr 16 16:52:37.704295 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:37.704266 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f3e27dd-be51-4786-9fde-2d6f337ed698-bundle\") pod \"4f3e27dd-be51-4786-9fde-2d6f337ed698\" (UID: \"4f3e27dd-be51-4786-9fde-2d6f337ed698\") " Apr 16 16:52:37.704416 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:37.704310 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwdvc\" (UniqueName: \"kubernetes.io/projected/4f3e27dd-be51-4786-9fde-2d6f337ed698-kube-api-access-jwdvc\") pod \"4f3e27dd-be51-4786-9fde-2d6f337ed698\" (UID: \"4f3e27dd-be51-4786-9fde-2d6f337ed698\") " Apr 16 16:52:37.704416 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:37.704343 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f3e27dd-be51-4786-9fde-2d6f337ed698-util\") pod \"4f3e27dd-be51-4786-9fde-2d6f337ed698\" (UID: \"4f3e27dd-be51-4786-9fde-2d6f337ed698\") " Apr 16 16:52:37.704875 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:37.704843 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f3e27dd-be51-4786-9fde-2d6f337ed698-bundle" (OuterVolumeSpecName: "bundle") pod "4f3e27dd-be51-4786-9fde-2d6f337ed698" (UID: "4f3e27dd-be51-4786-9fde-2d6f337ed698"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:52:37.706434 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:37.706412 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f3e27dd-be51-4786-9fde-2d6f337ed698-kube-api-access-jwdvc" (OuterVolumeSpecName: "kube-api-access-jwdvc") pod "4f3e27dd-be51-4786-9fde-2d6f337ed698" (UID: "4f3e27dd-be51-4786-9fde-2d6f337ed698"). InnerVolumeSpecName "kube-api-access-jwdvc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:52:37.709972 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:37.709864 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f3e27dd-be51-4786-9fde-2d6f337ed698-util" (OuterVolumeSpecName: "util") pod "4f3e27dd-be51-4786-9fde-2d6f337ed698" (UID: "4f3e27dd-be51-4786-9fde-2d6f337ed698"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:52:37.804986 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:37.804946 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f3e27dd-be51-4786-9fde-2d6f337ed698-util\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:52:37.804986 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:37.804974 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f3e27dd-be51-4786-9fde-2d6f337ed698-bundle\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:52:37.804986 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:37.804984 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jwdvc\" (UniqueName: \"kubernetes.io/projected/4f3e27dd-be51-4786-9fde-2d6f337ed698-kube-api-access-jwdvc\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:52:38.563187 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:38.563149 2572 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jx78q" event={"ID":"4f3e27dd-be51-4786-9fde-2d6f337ed698","Type":"ContainerDied","Data":"83816f2ff897652f3682786f8df88e202990cfce0c94296c5fefba485230a1df"} Apr 16 16:52:38.563187 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:38.563176 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jx78q" Apr 16 16:52:38.563187 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:38.563186 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83816f2ff897652f3682786f8df88e202990cfce0c94296c5fefba485230a1df" Apr 16 16:52:45.556626 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:45.556588 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r2wzv"] Apr 16 16:52:45.556978 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:45.556871 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f3e27dd-be51-4786-9fde-2d6f337ed698" containerName="pull" Apr 16 16:52:45.556978 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:45.556883 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f3e27dd-be51-4786-9fde-2d6f337ed698" containerName="pull" Apr 16 16:52:45.556978 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:45.556890 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f3e27dd-be51-4786-9fde-2d6f337ed698" containerName="extract" Apr 16 16:52:45.556978 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:45.556895 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f3e27dd-be51-4786-9fde-2d6f337ed698" containerName="extract" Apr 16 16:52:45.556978 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:45.556904 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="4f3e27dd-be51-4786-9fde-2d6f337ed698" containerName="util" Apr 16 16:52:45.556978 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:45.556910 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f3e27dd-be51-4786-9fde-2d6f337ed698" containerName="util" Apr 16 16:52:45.556978 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:45.556956 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f3e27dd-be51-4786-9fde-2d6f337ed698" containerName="extract" Apr 16 16:52:45.559392 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:45.559376 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r2wzv" Apr 16 16:52:45.562379 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:45.562352 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:52:45.562484 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:45.562462 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 16 16:52:45.562545 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:45.562515 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-84cdf\"" Apr 16 16:52:45.569900 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:45.569881 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r2wzv"] Apr 16 16:52:45.659558 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:45.659531 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npkt7\" (UniqueName: \"kubernetes.io/projected/5d81cc90-73a9-4226-994f-405012730cfb-kube-api-access-npkt7\") pod 
\"cert-manager-operator-controller-manager-7ccfb878b5-r2wzv\" (UID: \"5d81cc90-73a9-4226-994f-405012730cfb\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r2wzv" Apr 16 16:52:45.659679 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:45.659590 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5d81cc90-73a9-4226-994f-405012730cfb-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-r2wzv\" (UID: \"5d81cc90-73a9-4226-994f-405012730cfb\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r2wzv" Apr 16 16:52:45.760526 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:45.760499 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5d81cc90-73a9-4226-994f-405012730cfb-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-r2wzv\" (UID: \"5d81cc90-73a9-4226-994f-405012730cfb\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r2wzv" Apr 16 16:52:45.760621 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:45.760546 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-npkt7\" (UniqueName: \"kubernetes.io/projected/5d81cc90-73a9-4226-994f-405012730cfb-kube-api-access-npkt7\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-r2wzv\" (UID: \"5d81cc90-73a9-4226-994f-405012730cfb\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r2wzv" Apr 16 16:52:45.760926 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:45.760910 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5d81cc90-73a9-4226-994f-405012730cfb-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-r2wzv\" (UID: \"5d81cc90-73a9-4226-994f-405012730cfb\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r2wzv" Apr 16 16:52:45.772695 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:45.772672 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-npkt7\" (UniqueName: \"kubernetes.io/projected/5d81cc90-73a9-4226-994f-405012730cfb-kube-api-access-npkt7\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-r2wzv\" (UID: \"5d81cc90-73a9-4226-994f-405012730cfb\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r2wzv" Apr 16 16:52:45.867797 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:45.867774 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r2wzv" Apr 16 16:52:45.991469 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:45.991441 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r2wzv"] Apr 16 16:52:45.992673 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:52:45.992629 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d81cc90_73a9_4226_994f_405012730cfb.slice/crio-bb8c35f8a645fca76a6c916831bf9a96d34e6a041ca3c6bf4954665c018e941a WatchSource:0}: Error finding container bb8c35f8a645fca76a6c916831bf9a96d34e6a041ca3c6bf4954665c018e941a: Status 404 returned error can't find the container with id bb8c35f8a645fca76a6c916831bf9a96d34e6a041ca3c6bf4954665c018e941a Apr 16 16:52:46.588251 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:46.588219 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r2wzv" event={"ID":"5d81cc90-73a9-4226-994f-405012730cfb","Type":"ContainerStarted","Data":"bb8c35f8a645fca76a6c916831bf9a96d34e6a041ca3c6bf4954665c018e941a"} Apr 16 16:52:48.595385 
ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:48.595314 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r2wzv" event={"ID":"5d81cc90-73a9-4226-994f-405012730cfb","Type":"ContainerStarted","Data":"321c1176dc249fba8ba39ffa42d645a14e8c2ab30ad8a8fa3e42e19ea5e177b8"}
Apr 16 16:52:48.619306 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:48.619256 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r2wzv" podStartSLOduration=1.347173526 podStartE2EDuration="3.619237995s" podCreationTimestamp="2026-04-16 16:52:45 +0000 UTC" firstStartedPulling="2026-04-16 16:52:45.995516273 +0000 UTC m=+283.164227614" lastFinishedPulling="2026-04-16 16:52:48.267580739 +0000 UTC m=+285.436292083" observedRunningTime="2026-04-16 16:52:48.617585446 +0000 UTC m=+285.786296813" watchObservedRunningTime="2026-04-16 16:52:48.619237995 +0000 UTC m=+285.787949359"
Apr 16 16:52:55.621505 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:55.621475 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f27pcf"]
Apr 16 16:52:55.623871 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:55.623857 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f27pcf"
Apr 16 16:52:55.626415 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:55.626393 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 16:52:55.627493 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:55.627474 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wj6nm\""
Apr 16 16:52:55.627581 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:55.627490 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 16:52:55.632240 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:55.632218 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f27pcf"]
Apr 16 16:52:55.733002 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:55.732981 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f27pcf\" (UID: \"ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f27pcf"
Apr 16 16:52:55.733113 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:55.733015 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmf9n\" (UniqueName: \"kubernetes.io/projected/ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0-kube-api-access-lmf9n\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f27pcf\" (UID: \"ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f27pcf"
Apr 16 16:52:55.733113 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:55.733047 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f27pcf\" (UID: \"ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f27pcf"
Apr 16 16:52:55.833692 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:55.833663 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f27pcf\" (UID: \"ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f27pcf"
Apr 16 16:52:55.833828 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:55.833705 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lmf9n\" (UniqueName: \"kubernetes.io/projected/ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0-kube-api-access-lmf9n\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f27pcf\" (UID: \"ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f27pcf"
Apr 16 16:52:55.833974 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:55.833955 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f27pcf\" (UID: \"ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f27pcf"
Apr 16 16:52:55.834175 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:55.834160 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f27pcf\" (UID: \"ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f27pcf"
Apr 16 16:52:55.834266 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:55.834250 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f27pcf\" (UID: \"ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f27pcf"
Apr 16 16:52:55.842226 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:55.842198 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmf9n\" (UniqueName: \"kubernetes.io/projected/ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0-kube-api-access-lmf9n\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f27pcf\" (UID: \"ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f27pcf"
Apr 16 16:52:55.933485 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:55.933433 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f27pcf"
Apr 16 16:52:56.055794 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:56.055754 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f27pcf"]
Apr 16 16:52:56.058686 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:52:56.058659 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podceefd178_a9b3_45c3_8e52_0b4b40dbfbd0.slice/crio-87648cafc72abde273b166d4ef0696d489e9a7b994a3cafdd66ee5d9baa5f374 WatchSource:0}: Error finding container 87648cafc72abde273b166d4ef0696d489e9a7b994a3cafdd66ee5d9baa5f374: Status 404 returned error can't find the container with id 87648cafc72abde273b166d4ef0696d489e9a7b994a3cafdd66ee5d9baa5f374
Apr 16 16:52:56.624813 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:56.624779 2572 generic.go:358] "Generic (PLEG): container finished" podID="ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0" containerID="de95a1028b91550d2434cfb077b853632f768fdef90ba69318e4b6b0bca94427" exitCode=0
Apr 16 16:52:56.625191 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:56.624864 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f27pcf" event={"ID":"ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0","Type":"ContainerDied","Data":"de95a1028b91550d2434cfb077b853632f768fdef90ba69318e4b6b0bca94427"}
Apr 16 16:52:56.625191 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:56.624896 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f27pcf" event={"ID":"ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0","Type":"ContainerStarted","Data":"87648cafc72abde273b166d4ef0696d489e9a7b994a3cafdd66ee5d9baa5f374"}
Apr 16 16:52:59.636478 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:59.636439 2572 generic.go:358] "Generic (PLEG): container finished" podID="ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0" containerID="4c35b215c597f2c60e189318eb0b696460261b1280ef91a4a2b65003da8f2d38" exitCode=0
Apr 16 16:52:59.636828 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:52:59.636524 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f27pcf" event={"ID":"ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0","Type":"ContainerDied","Data":"4c35b215c597f2c60e189318eb0b696460261b1280ef91a4a2b65003da8f2d38"}
Apr 16 16:53:00.641616 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:00.641580 2572 generic.go:358] "Generic (PLEG): container finished" podID="ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0" containerID="710bf0e419a4737f19c19aa1d7ee6c885256eba27e1206977410ba094e714514" exitCode=0
Apr 16 16:53:00.641981 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:00.641658 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f27pcf" event={"ID":"ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0","Type":"ContainerDied","Data":"710bf0e419a4737f19c19aa1d7ee6c885256eba27e1206977410ba094e714514"}
Apr 16 16:53:01.756160 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:01.756134 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f27pcf"
Apr 16 16:53:01.879592 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:01.879562 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0-util\") pod \"ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0\" (UID: \"ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0\") "
Apr 16 16:53:01.879764 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:01.879610 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmf9n\" (UniqueName: \"kubernetes.io/projected/ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0-kube-api-access-lmf9n\") pod \"ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0\" (UID: \"ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0\") "
Apr 16 16:53:01.879764 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:01.879632 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0-bundle\") pod \"ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0\" (UID: \"ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0\") "
Apr 16 16:53:01.880027 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:01.880003 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0-bundle" (OuterVolumeSpecName: "bundle") pod "ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0" (UID: "ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:53:01.881521 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:01.881492 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0-kube-api-access-lmf9n" (OuterVolumeSpecName: "kube-api-access-lmf9n") pod "ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0" (UID: "ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0"). InnerVolumeSpecName "kube-api-access-lmf9n". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:53:01.883869 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:01.883848 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0-util" (OuterVolumeSpecName: "util") pod "ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0" (UID: "ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:53:01.980136 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:01.980053 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0-util\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 16:53:01.980136 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:01.980095 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lmf9n\" (UniqueName: \"kubernetes.io/projected/ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0-kube-api-access-lmf9n\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 16:53:01.980136 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:01.980109 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0-bundle\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 16:53:02.650599 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:02.650569 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f27pcf" event={"ID":"ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0","Type":"ContainerDied","Data":"87648cafc72abde273b166d4ef0696d489e9a7b994a3cafdd66ee5d9baa5f374"}
Apr 16 16:53:02.650599 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:02.650599 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87648cafc72abde273b166d4ef0696d489e9a7b994a3cafdd66ee5d9baa5f374"
Apr 16 16:53:02.650900 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:02.650638 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f27pcf"
Apr 16 16:53:03.465867 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:03.465837 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brhp4_c0c5c0a0-29b2-4743-af7a-0c1150829a60/ovn-acl-logging/0.log"
Apr 16 16:53:03.466342 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:03.465880 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brhp4_c0c5c0a0-29b2-4743-af7a-0c1150829a60/ovn-acl-logging/0.log"
Apr 16 16:53:03.469159 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:03.469140 2572 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 16:53:08.027643 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:08.027607 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-dn95d"]
Apr 16 16:53:08.032393 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:08.027921 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0" containerName="extract"
Apr 16 16:53:08.032393 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:08.027933 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0" containerName="extract"
Apr 16 16:53:08.032393 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:08.027941 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0" containerName="util"
Apr 16 16:53:08.032393 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:08.027946 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0" containerName="util"
Apr 16 16:53:08.032393 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:08.027960 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0" containerName="pull"
Apr 16 16:53:08.032393 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:08.027965 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0" containerName="pull"
Apr 16 16:53:08.032393 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:08.028012 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ceefd178-a9b3-45c3-8e52-0b4b40dbfbd0" containerName="extract"
Apr 16 16:53:08.033311 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:08.033296 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-dn95d"
Apr 16 16:53:08.036985 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:08.036957 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 16 16:53:08.036985 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:08.036984 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 16 16:53:08.037201 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:08.037009 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-4hxsd\""
Apr 16 16:53:08.037553 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:08.037535 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-dn95d"]
Apr 16 16:53:08.126686 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:08.126662 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vlz7\" (UniqueName: \"kubernetes.io/projected/148930b9-33c3-45fb-9cce-7707e5682e08-kube-api-access-2vlz7\") pod \"openshift-lws-operator-bfc7f696d-dn95d\" (UID: \"148930b9-33c3-45fb-9cce-7707e5682e08\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-dn95d"
Apr 16 16:53:08.126829 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:08.126712 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/148930b9-33c3-45fb-9cce-7707e5682e08-tmp\") pod \"openshift-lws-operator-bfc7f696d-dn95d\" (UID: \"148930b9-33c3-45fb-9cce-7707e5682e08\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-dn95d"
Apr 16 16:53:08.227199 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:08.227169 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/148930b9-33c3-45fb-9cce-7707e5682e08-tmp\") pod \"openshift-lws-operator-bfc7f696d-dn95d\" (UID: \"148930b9-33c3-45fb-9cce-7707e5682e08\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-dn95d"
Apr 16 16:53:08.227337 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:08.227219 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vlz7\" (UniqueName: \"kubernetes.io/projected/148930b9-33c3-45fb-9cce-7707e5682e08-kube-api-access-2vlz7\") pod \"openshift-lws-operator-bfc7f696d-dn95d\" (UID: \"148930b9-33c3-45fb-9cce-7707e5682e08\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-dn95d"
Apr 16 16:53:08.227532 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:08.227513 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/148930b9-33c3-45fb-9cce-7707e5682e08-tmp\") pod \"openshift-lws-operator-bfc7f696d-dn95d\" (UID: \"148930b9-33c3-45fb-9cce-7707e5682e08\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-dn95d"
Apr 16 16:53:08.237773 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:08.237751 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vlz7\" (UniqueName: \"kubernetes.io/projected/148930b9-33c3-45fb-9cce-7707e5682e08-kube-api-access-2vlz7\") pod \"openshift-lws-operator-bfc7f696d-dn95d\" (UID: \"148930b9-33c3-45fb-9cce-7707e5682e08\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-dn95d"
Apr 16 16:53:08.351713 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:08.351649 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-dn95d"
Apr 16 16:53:08.464705 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:08.464681 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-dn95d"]
Apr 16 16:53:08.466836 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:53:08.466807 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod148930b9_33c3_45fb_9cce_7707e5682e08.slice/crio-7518d643ec485ce0269acd10afa94df0958358743b1ae41991fbaf2a19232a92 WatchSource:0}: Error finding container 7518d643ec485ce0269acd10afa94df0958358743b1ae41991fbaf2a19232a92: Status 404 returned error can't find the container with id 7518d643ec485ce0269acd10afa94df0958358743b1ae41991fbaf2a19232a92
Apr 16 16:53:08.468580 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:08.468562 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 16:53:08.669196 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:08.669163 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-dn95d" event={"ID":"148930b9-33c3-45fb-9cce-7707e5682e08","Type":"ContainerStarted","Data":"7518d643ec485ce0269acd10afa94df0958358743b1ae41991fbaf2a19232a92"}
Apr 16 16:53:10.676201 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:10.676163 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-dn95d" event={"ID":"148930b9-33c3-45fb-9cce-7707e5682e08","Type":"ContainerStarted","Data":"6a5f4603a15af0fe35ce775798a5f2e793bbe982bfb3c1ef9cdf0d28e1f09cce"}
Apr 16 16:53:10.692467 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:10.692423 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-dn95d" podStartSLOduration=1.010429117 podStartE2EDuration="2.692409374s" podCreationTimestamp="2026-04-16 16:53:08 +0000 UTC" firstStartedPulling="2026-04-16 16:53:08.468682047 +0000 UTC m=+305.637393388" lastFinishedPulling="2026-04-16 16:53:10.150662304 +0000 UTC m=+307.319373645" observedRunningTime="2026-04-16 16:53:10.691130025 +0000 UTC m=+307.859841394" watchObservedRunningTime="2026-04-16 16:53:10.692409374 +0000 UTC m=+307.861120736"
Apr 16 16:53:24.022009 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:24.021977 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9bp7"]
Apr 16 16:53:24.024539 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:24.024518 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9bp7"
Apr 16 16:53:24.027382 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:24.027363 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 16:53:24.027477 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:24.027414 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wj6nm\""
Apr 16 16:53:24.027602 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:24.027583 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 16:53:24.035916 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:24.035893 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9bp7"]
Apr 16 16:53:24.149381 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:24.149355 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2v4w\" (UniqueName: \"kubernetes.io/projected/0218a0fd-e140-4cc5-80e4-de10f6d604ab-kube-api-access-z2v4w\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9bp7\" (UID: \"0218a0fd-e140-4cc5-80e4-de10f6d604ab\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9bp7"
Apr 16 16:53:24.149491 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:24.149392 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0218a0fd-e140-4cc5-80e4-de10f6d604ab-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9bp7\" (UID: \"0218a0fd-e140-4cc5-80e4-de10f6d604ab\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9bp7"
Apr 16 16:53:24.149491 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:24.149426 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0218a0fd-e140-4cc5-80e4-de10f6d604ab-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9bp7\" (UID: \"0218a0fd-e140-4cc5-80e4-de10f6d604ab\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9bp7"
Apr 16 16:53:24.250412 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:24.250377 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2v4w\" (UniqueName: \"kubernetes.io/projected/0218a0fd-e140-4cc5-80e4-de10f6d604ab-kube-api-access-z2v4w\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9bp7\" (UID: \"0218a0fd-e140-4cc5-80e4-de10f6d604ab\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9bp7"
Apr 16 16:53:24.250412 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:24.250418 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0218a0fd-e140-4cc5-80e4-de10f6d604ab-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9bp7\" (UID: \"0218a0fd-e140-4cc5-80e4-de10f6d604ab\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9bp7"
Apr 16 16:53:24.250555 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:24.250447 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0218a0fd-e140-4cc5-80e4-de10f6d604ab-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9bp7\" (UID: \"0218a0fd-e140-4cc5-80e4-de10f6d604ab\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9bp7"
Apr 16 16:53:24.250740 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:24.250724 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0218a0fd-e140-4cc5-80e4-de10f6d604ab-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9bp7\" (UID: \"0218a0fd-e140-4cc5-80e4-de10f6d604ab\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9bp7"
Apr 16 16:53:24.250779 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:24.250751 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0218a0fd-e140-4cc5-80e4-de10f6d604ab-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9bp7\" (UID: \"0218a0fd-e140-4cc5-80e4-de10f6d604ab\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9bp7"
Apr 16 16:53:24.258762 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:24.258733 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2v4w\" (UniqueName: \"kubernetes.io/projected/0218a0fd-e140-4cc5-80e4-de10f6d604ab-kube-api-access-z2v4w\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9bp7\" (UID: \"0218a0fd-e140-4cc5-80e4-de10f6d604ab\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9bp7"
Apr 16 16:53:24.337038 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:24.336992 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9bp7"
Apr 16 16:53:24.449930 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:24.449905 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9bp7"]
Apr 16 16:53:24.452074 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:53:24.452034 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0218a0fd_e140_4cc5_80e4_de10f6d604ab.slice/crio-e1a93c3334da98f9ce7455b554c7c6c2345b156481ea67c696e0c872d93aa0bd WatchSource:0}: Error finding container e1a93c3334da98f9ce7455b554c7c6c2345b156481ea67c696e0c872d93aa0bd: Status 404 returned error can't find the container with id e1a93c3334da98f9ce7455b554c7c6c2345b156481ea67c696e0c872d93aa0bd
Apr 16 16:53:24.719173 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:24.719148 2572 generic.go:358] "Generic (PLEG): container finished" podID="0218a0fd-e140-4cc5-80e4-de10f6d604ab" containerID="4c97773ab6e42daba9e754d98b9c883ae4b80c904557715392de2162cfe5be15" exitCode=0
Apr 16 16:53:24.719291 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:24.719233 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9bp7" event={"ID":"0218a0fd-e140-4cc5-80e4-de10f6d604ab","Type":"ContainerDied","Data":"4c97773ab6e42daba9e754d98b9c883ae4b80c904557715392de2162cfe5be15"}
Apr 16 16:53:24.719291 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:24.719267 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9bp7" event={"ID":"0218a0fd-e140-4cc5-80e4-de10f6d604ab","Type":"ContainerStarted","Data":"e1a93c3334da98f9ce7455b554c7c6c2345b156481ea67c696e0c872d93aa0bd"}
Apr 16 16:53:26.726735 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:26.726702 2572 generic.go:358] "Generic (PLEG): container finished" podID="0218a0fd-e140-4cc5-80e4-de10f6d604ab" containerID="07f35d1bf20eeb998e176db9140820a7c62fab72363b8037e19e63d2c4e99c65" exitCode=0
Apr 16 16:53:26.727095 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:26.726782 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9bp7" event={"ID":"0218a0fd-e140-4cc5-80e4-de10f6d604ab","Type":"ContainerDied","Data":"07f35d1bf20eeb998e176db9140820a7c62fab72363b8037e19e63d2c4e99c65"}
Apr 16 16:53:27.731912 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:27.731873 2572 generic.go:358] "Generic (PLEG): container finished" podID="0218a0fd-e140-4cc5-80e4-de10f6d604ab" containerID="4d9e2c53e9c79bed91018ad3111a0eab7c9cbdfdac6a8783656fa25873322a5e" exitCode=0
Apr 16 16:53:27.732364 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:27.731946 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9bp7" event={"ID":"0218a0fd-e140-4cc5-80e4-de10f6d604ab","Type":"ContainerDied","Data":"4d9e2c53e9c79bed91018ad3111a0eab7c9cbdfdac6a8783656fa25873322a5e"}
Apr 16 16:53:28.856108 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:28.856088 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9bp7"
Apr 16 16:53:28.979322 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:28.979298 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0218a0fd-e140-4cc5-80e4-de10f6d604ab-util\") pod \"0218a0fd-e140-4cc5-80e4-de10f6d604ab\" (UID: \"0218a0fd-e140-4cc5-80e4-de10f6d604ab\") "
Apr 16 16:53:28.979436 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:28.979345 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0218a0fd-e140-4cc5-80e4-de10f6d604ab-bundle\") pod \"0218a0fd-e140-4cc5-80e4-de10f6d604ab\" (UID: \"0218a0fd-e140-4cc5-80e4-de10f6d604ab\") "
Apr 16 16:53:28.979436 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:28.979365 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2v4w\" (UniqueName: \"kubernetes.io/projected/0218a0fd-e140-4cc5-80e4-de10f6d604ab-kube-api-access-z2v4w\") pod \"0218a0fd-e140-4cc5-80e4-de10f6d604ab\" (UID: \"0218a0fd-e140-4cc5-80e4-de10f6d604ab\") "
Apr 16 16:53:28.980173 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:28.980141 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0218a0fd-e140-4cc5-80e4-de10f6d604ab-bundle" (OuterVolumeSpecName: "bundle") pod "0218a0fd-e140-4cc5-80e4-de10f6d604ab" (UID: "0218a0fd-e140-4cc5-80e4-de10f6d604ab"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:53:28.981324 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:28.981295 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0218a0fd-e140-4cc5-80e4-de10f6d604ab-kube-api-access-z2v4w" (OuterVolumeSpecName: "kube-api-access-z2v4w") pod "0218a0fd-e140-4cc5-80e4-de10f6d604ab" (UID: "0218a0fd-e140-4cc5-80e4-de10f6d604ab"). InnerVolumeSpecName "kube-api-access-z2v4w". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:53:28.984298 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:28.984273 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0218a0fd-e140-4cc5-80e4-de10f6d604ab-util" (OuterVolumeSpecName: "util") pod "0218a0fd-e140-4cc5-80e4-de10f6d604ab" (UID: "0218a0fd-e140-4cc5-80e4-de10f6d604ab"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:53:29.080772 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:29.080712 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0218a0fd-e140-4cc5-80e4-de10f6d604ab-util\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 16:53:29.080772 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:29.080761 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0218a0fd-e140-4cc5-80e4-de10f6d604ab-bundle\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 16:53:29.080772 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:29.080772 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z2v4w\" (UniqueName: \"kubernetes.io/projected/0218a0fd-e140-4cc5-80e4-de10f6d604ab-kube-api-access-z2v4w\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 16:53:29.740566 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:29.740531 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9bp7" event={"ID":"0218a0fd-e140-4cc5-80e4-de10f6d604ab","Type":"ContainerDied","Data":"e1a93c3334da98f9ce7455b554c7c6c2345b156481ea67c696e0c872d93aa0bd"}
Apr 16 16:53:29.740566 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:29.740568 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1a93c3334da98f9ce7455b554c7c6c2345b156481ea67c696e0c872d93aa0bd"
Apr 16 16:53:29.740732 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:29.740575 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9bp7"
Apr 16 16:53:38.695009 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:38.694975 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c27c9j7"]
Apr 16 16:53:38.695404 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:38.695263 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0218a0fd-e140-4cc5-80e4-de10f6d604ab" containerName="extract"
Apr 16 16:53:38.695404 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:38.695274 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0218a0fd-e140-4cc5-80e4-de10f6d604ab" containerName="extract"
Apr 16 16:53:38.695404 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:38.695286 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0218a0fd-e140-4cc5-80e4-de10f6d604ab" containerName="util"
Apr 16 16:53:38.695404 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:38.695291 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0218a0fd-e140-4cc5-80e4-de10f6d604ab" containerName="util"
Apr 16 16:53:38.695404 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:38.695299 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0218a0fd-e140-4cc5-80e4-de10f6d604ab" containerName="pull"
Apr 16 16:53:38.695404 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:38.695320 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0218a0fd-e140-4cc5-80e4-de10f6d604ab" containerName="pull"
Apr 16 16:53:38.695404 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:38.695366 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="0218a0fd-e140-4cc5-80e4-de10f6d604ab" containerName="extract"
Apr 16 16:53:38.697947 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:38.697929 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c27c9j7"
Apr 16 16:53:38.701745 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:38.701724 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 16:53:38.702753 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:38.702728 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 16:53:38.702840 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:38.702758 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wj6nm\""
Apr 16 16:53:38.713976 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:38.713954 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c27c9j7"]
Apr 16 16:53:38.853342 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:38.853312 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/657ca16f-2e54-41d9-b3a6-c08e15fde96c-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c27c9j7\" (UID: 
\"657ca16f-2e54-41d9-b3a6-c08e15fde96c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c27c9j7" Apr 16 16:53:38.853484 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:38.853355 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4fww\" (UniqueName: \"kubernetes.io/projected/657ca16f-2e54-41d9-b3a6-c08e15fde96c-kube-api-access-k4fww\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c27c9j7\" (UID: \"657ca16f-2e54-41d9-b3a6-c08e15fde96c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c27c9j7" Apr 16 16:53:38.853484 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:38.853408 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/657ca16f-2e54-41d9-b3a6-c08e15fde96c-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c27c9j7\" (UID: \"657ca16f-2e54-41d9-b3a6-c08e15fde96c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c27c9j7" Apr 16 16:53:38.953841 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:38.953760 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4fww\" (UniqueName: \"kubernetes.io/projected/657ca16f-2e54-41d9-b3a6-c08e15fde96c-kube-api-access-k4fww\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c27c9j7\" (UID: \"657ca16f-2e54-41d9-b3a6-c08e15fde96c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c27c9j7" Apr 16 16:53:38.953841 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:38.953795 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/657ca16f-2e54-41d9-b3a6-c08e15fde96c-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c27c9j7\" (UID: 
\"657ca16f-2e54-41d9-b3a6-c08e15fde96c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c27c9j7" Apr 16 16:53:38.954044 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:38.953842 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/657ca16f-2e54-41d9-b3a6-c08e15fde96c-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c27c9j7\" (UID: \"657ca16f-2e54-41d9-b3a6-c08e15fde96c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c27c9j7" Apr 16 16:53:38.954145 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:38.954131 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/657ca16f-2e54-41d9-b3a6-c08e15fde96c-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c27c9j7\" (UID: \"657ca16f-2e54-41d9-b3a6-c08e15fde96c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c27c9j7" Apr 16 16:53:38.954237 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:38.954219 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/657ca16f-2e54-41d9-b3a6-c08e15fde96c-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c27c9j7\" (UID: \"657ca16f-2e54-41d9-b3a6-c08e15fde96c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c27c9j7" Apr 16 16:53:38.974704 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:38.974680 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4fww\" (UniqueName: \"kubernetes.io/projected/657ca16f-2e54-41d9-b3a6-c08e15fde96c-kube-api-access-k4fww\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c27c9j7\" (UID: \"657ca16f-2e54-41d9-b3a6-c08e15fde96c\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c27c9j7" Apr 16 16:53:39.005987 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:39.005966 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c27c9j7" Apr 16 16:53:39.347197 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:39.347140 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c27c9j7"] Apr 16 16:53:39.349887 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:53:39.349861 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod657ca16f_2e54_41d9_b3a6_c08e15fde96c.slice/crio-6da0b56e970f68674406f0cbaf7dbba8a6de0db1d1b2335ddc361a56aba89cdf WatchSource:0}: Error finding container 6da0b56e970f68674406f0cbaf7dbba8a6de0db1d1b2335ddc361a56aba89cdf: Status 404 returned error can't find the container with id 6da0b56e970f68674406f0cbaf7dbba8a6de0db1d1b2335ddc361a56aba89cdf Apr 16 16:53:39.777620 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:39.777584 2572 generic.go:358] "Generic (PLEG): container finished" podID="657ca16f-2e54-41d9-b3a6-c08e15fde96c" containerID="d3599e139f80e706cdbf3bcfa714508cf9ba1bbf9f0e3e989b61240077781e95" exitCode=0 Apr 16 16:53:39.778118 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:39.777668 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c27c9j7" event={"ID":"657ca16f-2e54-41d9-b3a6-c08e15fde96c","Type":"ContainerDied","Data":"d3599e139f80e706cdbf3bcfa714508cf9ba1bbf9f0e3e989b61240077781e95"} Apr 16 16:53:39.778118 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:39.777699 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c27c9j7" event={"ID":"657ca16f-2e54-41d9-b3a6-c08e15fde96c","Type":"ContainerStarted","Data":"6da0b56e970f68674406f0cbaf7dbba8a6de0db1d1b2335ddc361a56aba89cdf"} Apr 16 16:53:39.951981 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:39.951949 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-zhckx"] Apr 16 16:53:39.953987 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:39.953973 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-zhckx" Apr 16 16:53:39.956889 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:39.956872 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-v8l6m\"" Apr 16 16:53:39.957418 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:39.957400 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 16 16:53:39.957471 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:39.957454 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 16 16:53:39.984998 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:39.984978 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-zhckx"] Apr 16 16:53:40.065477 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:40.065419 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8rqb\" (UniqueName: \"kubernetes.io/projected/4f535a89-d406-4d79-9786-28a99e843b8e-kube-api-access-j8rqb\") pod \"servicemesh-operator3-55f49c5f94-zhckx\" (UID: \"4f535a89-d406-4d79-9786-28a99e843b8e\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-zhckx" Apr 
16 16:53:40.065578 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:40.065482 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/4f535a89-d406-4d79-9786-28a99e843b8e-operator-config\") pod \"servicemesh-operator3-55f49c5f94-zhckx\" (UID: \"4f535a89-d406-4d79-9786-28a99e843b8e\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-zhckx" Apr 16 16:53:40.166102 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:40.166047 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8rqb\" (UniqueName: \"kubernetes.io/projected/4f535a89-d406-4d79-9786-28a99e843b8e-kube-api-access-j8rqb\") pod \"servicemesh-operator3-55f49c5f94-zhckx\" (UID: \"4f535a89-d406-4d79-9786-28a99e843b8e\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-zhckx" Apr 16 16:53:40.166244 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:40.166138 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/4f535a89-d406-4d79-9786-28a99e843b8e-operator-config\") pod \"servicemesh-operator3-55f49c5f94-zhckx\" (UID: \"4f535a89-d406-4d79-9786-28a99e843b8e\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-zhckx" Apr 16 16:53:40.168373 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:40.168356 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/4f535a89-d406-4d79-9786-28a99e843b8e-operator-config\") pod \"servicemesh-operator3-55f49c5f94-zhckx\" (UID: \"4f535a89-d406-4d79-9786-28a99e843b8e\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-zhckx" Apr 16 16:53:40.174549 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:40.174527 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8rqb\" (UniqueName: 
\"kubernetes.io/projected/4f535a89-d406-4d79-9786-28a99e843b8e-kube-api-access-j8rqb\") pod \"servicemesh-operator3-55f49c5f94-zhckx\" (UID: \"4f535a89-d406-4d79-9786-28a99e843b8e\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-zhckx" Apr 16 16:53:40.263343 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:40.263308 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-zhckx" Apr 16 16:53:40.388416 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:40.388393 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-zhckx"] Apr 16 16:53:40.392503 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:53:40.392473 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f535a89_d406_4d79_9786_28a99e843b8e.slice/crio-357bcc6cbd445ad3056189c154c5779eb4bd14482b6571c0861d36602b51b13b WatchSource:0}: Error finding container 357bcc6cbd445ad3056189c154c5779eb4bd14482b6571c0861d36602b51b13b: Status 404 returned error can't find the container with id 357bcc6cbd445ad3056189c154c5779eb4bd14482b6571c0861d36602b51b13b Apr 16 16:53:40.783629 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:40.783596 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-zhckx" event={"ID":"4f535a89-d406-4d79-9786-28a99e843b8e","Type":"ContainerStarted","Data":"357bcc6cbd445ad3056189c154c5779eb4bd14482b6571c0861d36602b51b13b"} Apr 16 16:53:40.785252 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:40.785228 2572 generic.go:358] "Generic (PLEG): container finished" podID="657ca16f-2e54-41d9-b3a6-c08e15fde96c" containerID="62180ec7d96a618ed67088fc421d8389dac36f919d4f0735856aa1a73611890a" exitCode=0 Apr 16 16:53:40.785357 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:40.785306 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c27c9j7" event={"ID":"657ca16f-2e54-41d9-b3a6-c08e15fde96c","Type":"ContainerDied","Data":"62180ec7d96a618ed67088fc421d8389dac36f919d4f0735856aa1a73611890a"} Apr 16 16:53:41.790284 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:41.790249 2572 generic.go:358] "Generic (PLEG): container finished" podID="657ca16f-2e54-41d9-b3a6-c08e15fde96c" containerID="76131b4536af01ade73bdb339323c4fa3c34261bac846e626da07869bd1d0461" exitCode=0 Apr 16 16:53:41.790652 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:41.790313 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c27c9j7" event={"ID":"657ca16f-2e54-41d9-b3a6-c08e15fde96c","Type":"ContainerDied","Data":"76131b4536af01ade73bdb339323c4fa3c34261bac846e626da07869bd1d0461"} Apr 16 16:53:43.024232 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:43.024202 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c27c9j7" Apr 16 16:53:43.189231 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:43.189194 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/657ca16f-2e54-41d9-b3a6-c08e15fde96c-util\") pod \"657ca16f-2e54-41d9-b3a6-c08e15fde96c\" (UID: \"657ca16f-2e54-41d9-b3a6-c08e15fde96c\") " Apr 16 16:53:43.189412 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:43.189238 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/657ca16f-2e54-41d9-b3a6-c08e15fde96c-bundle\") pod \"657ca16f-2e54-41d9-b3a6-c08e15fde96c\" (UID: \"657ca16f-2e54-41d9-b3a6-c08e15fde96c\") " Apr 16 16:53:43.189412 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:43.189276 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4fww\" (UniqueName: \"kubernetes.io/projected/657ca16f-2e54-41d9-b3a6-c08e15fde96c-kube-api-access-k4fww\") pod \"657ca16f-2e54-41d9-b3a6-c08e15fde96c\" (UID: \"657ca16f-2e54-41d9-b3a6-c08e15fde96c\") " Apr 16 16:53:43.190134 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:43.190100 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/657ca16f-2e54-41d9-b3a6-c08e15fde96c-bundle" (OuterVolumeSpecName: "bundle") pod "657ca16f-2e54-41d9-b3a6-c08e15fde96c" (UID: "657ca16f-2e54-41d9-b3a6-c08e15fde96c"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:53:43.191525 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:43.191501 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/657ca16f-2e54-41d9-b3a6-c08e15fde96c-kube-api-access-k4fww" (OuterVolumeSpecName: "kube-api-access-k4fww") pod "657ca16f-2e54-41d9-b3a6-c08e15fde96c" (UID: "657ca16f-2e54-41d9-b3a6-c08e15fde96c"). InnerVolumeSpecName "kube-api-access-k4fww". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:53:43.194887 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:43.194863 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/657ca16f-2e54-41d9-b3a6-c08e15fde96c-util" (OuterVolumeSpecName: "util") pod "657ca16f-2e54-41d9-b3a6-c08e15fde96c" (UID: "657ca16f-2e54-41d9-b3a6-c08e15fde96c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:53:43.289877 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:43.289843 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/657ca16f-2e54-41d9-b3a6-c08e15fde96c-util\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:53:43.289877 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:43.289872 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/657ca16f-2e54-41d9-b3a6-c08e15fde96c-bundle\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:53:43.289877 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:43.289882 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k4fww\" (UniqueName: \"kubernetes.io/projected/657ca16f-2e54-41d9-b3a6-c08e15fde96c-kube-api-access-k4fww\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:53:43.799215 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:43.799180 2572 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c27c9j7" event={"ID":"657ca16f-2e54-41d9-b3a6-c08e15fde96c","Type":"ContainerDied","Data":"6da0b56e970f68674406f0cbaf7dbba8a6de0db1d1b2335ddc361a56aba89cdf"} Apr 16 16:53:43.799215 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:43.799218 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6da0b56e970f68674406f0cbaf7dbba8a6de0db1d1b2335ddc361a56aba89cdf" Apr 16 16:53:43.799215 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:43.799193 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c27c9j7" Apr 16 16:53:43.800748 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:43.800719 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-zhckx" event={"ID":"4f535a89-d406-4d79-9786-28a99e843b8e","Type":"ContainerStarted","Data":"447ef29c35dd5b36977bdce1d5469b7685dbc733e3ea629007716f17fdf0d651"} Apr 16 16:53:43.800894 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:43.800881 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-zhckx" Apr 16 16:53:43.825894 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:43.825850 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-zhckx" podStartSLOduration=2.177144814 podStartE2EDuration="4.825837909s" podCreationTimestamp="2026-04-16 16:53:39 +0000 UTC" firstStartedPulling="2026-04-16 16:53:40.395466747 +0000 UTC m=+337.564178088" lastFinishedPulling="2026-04-16 16:53:43.044159841 +0000 UTC m=+340.212871183" observedRunningTime="2026-04-16 16:53:43.824034577 +0000 UTC m=+340.992745932" watchObservedRunningTime="2026-04-16 16:53:43.825837909 +0000 UTC m=+340.994549271" 
Apr 16 16:53:47.613837 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.613756 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9"] Apr 16 16:53:47.614298 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.614038 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="657ca16f-2e54-41d9-b3a6-c08e15fde96c" containerName="pull" Apr 16 16:53:47.614298 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.614047 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="657ca16f-2e54-41d9-b3a6-c08e15fde96c" containerName="pull" Apr 16 16:53:47.614298 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.614092 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="657ca16f-2e54-41d9-b3a6-c08e15fde96c" containerName="util" Apr 16 16:53:47.614298 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.614098 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="657ca16f-2e54-41d9-b3a6-c08e15fde96c" containerName="util" Apr 16 16:53:47.614298 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.614104 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="657ca16f-2e54-41d9-b3a6-c08e15fde96c" containerName="extract" Apr 16 16:53:47.614298 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.614110 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="657ca16f-2e54-41d9-b3a6-c08e15fde96c" containerName="extract" Apr 16 16:53:47.614298 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.614159 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="657ca16f-2e54-41d9-b3a6-c08e15fde96c" containerName="extract" Apr 16 16:53:47.615899 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.615884 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9" Apr 16 16:53:47.618669 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.618647 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 16:53:47.618669 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.618660 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 16 16:53:47.618826 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.618647 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 16 16:53:47.618826 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.618655 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 16 16:53:47.619040 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.619019 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 16:53:47.619889 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.619873 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 16 16:53:47.619981 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.619877 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-2mwsx\"" Apr 16 16:53:47.633415 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.633392 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9"] Apr 16 16:53:47.725040 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.725003 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: 
\"kubernetes.io/empty-dir/18cecd37-afce-493b-8a69-3b54afe5f0fd-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-wcfq9\" (UID: \"18cecd37-afce-493b-8a69-3b54afe5f0fd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9" Apr 16 16:53:47.725233 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.725113 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/18cecd37-afce-493b-8a69-3b54afe5f0fd-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-wcfq9\" (UID: \"18cecd37-afce-493b-8a69-3b54afe5f0fd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9" Apr 16 16:53:47.725233 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.725138 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/18cecd37-afce-493b-8a69-3b54afe5f0fd-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-wcfq9\" (UID: \"18cecd37-afce-493b-8a69-3b54afe5f0fd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9" Apr 16 16:53:47.725233 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.725182 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/18cecd37-afce-493b-8a69-3b54afe5f0fd-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-wcfq9\" (UID: \"18cecd37-afce-493b-8a69-3b54afe5f0fd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9" Apr 16 16:53:47.725233 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.725207 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/18cecd37-afce-493b-8a69-3b54afe5f0fd-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-wcfq9\" (UID: 
\"18cecd37-afce-493b-8a69-3b54afe5f0fd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9" Apr 16 16:53:47.725443 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.725258 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/18cecd37-afce-493b-8a69-3b54afe5f0fd-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-wcfq9\" (UID: \"18cecd37-afce-493b-8a69-3b54afe5f0fd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9" Apr 16 16:53:47.725443 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.725286 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgrkl\" (UniqueName: \"kubernetes.io/projected/18cecd37-afce-493b-8a69-3b54afe5f0fd-kube-api-access-bgrkl\") pod \"istiod-openshift-gateway-7cd77c7ffd-wcfq9\" (UID: \"18cecd37-afce-493b-8a69-3b54afe5f0fd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9" Apr 16 16:53:47.826298 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.826263 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/18cecd37-afce-493b-8a69-3b54afe5f0fd-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-wcfq9\" (UID: \"18cecd37-afce-493b-8a69-3b54afe5f0fd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9" Apr 16 16:53:47.826476 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.826310 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/18cecd37-afce-493b-8a69-3b54afe5f0fd-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-wcfq9\" (UID: \"18cecd37-afce-493b-8a69-3b54afe5f0fd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9" Apr 16 16:53:47.826476 ip-10-0-137-126 
kubenswrapper[2572]: I0416 16:53:47.826348 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/18cecd37-afce-493b-8a69-3b54afe5f0fd-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-wcfq9\" (UID: \"18cecd37-afce-493b-8a69-3b54afe5f0fd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9"
Apr 16 16:53:47.826476 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.826371 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/18cecd37-afce-493b-8a69-3b54afe5f0fd-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-wcfq9\" (UID: \"18cecd37-afce-493b-8a69-3b54afe5f0fd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9"
Apr 16 16:53:47.826648 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.826496 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/18cecd37-afce-493b-8a69-3b54afe5f0fd-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-wcfq9\" (UID: \"18cecd37-afce-493b-8a69-3b54afe5f0fd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9"
Apr 16 16:53:47.826648 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.826539 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bgrkl\" (UniqueName: \"kubernetes.io/projected/18cecd37-afce-493b-8a69-3b54afe5f0fd-kube-api-access-bgrkl\") pod \"istiod-openshift-gateway-7cd77c7ffd-wcfq9\" (UID: \"18cecd37-afce-493b-8a69-3b54afe5f0fd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9"
Apr 16 16:53:47.826648 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.826591 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/18cecd37-afce-493b-8a69-3b54afe5f0fd-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-wcfq9\" (UID: \"18cecd37-afce-493b-8a69-3b54afe5f0fd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9"
Apr 16 16:53:47.826934 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.826907 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/18cecd37-afce-493b-8a69-3b54afe5f0fd-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-wcfq9\" (UID: \"18cecd37-afce-493b-8a69-3b54afe5f0fd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9"
Apr 16 16:53:47.828813 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.828789 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/18cecd37-afce-493b-8a69-3b54afe5f0fd-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-wcfq9\" (UID: \"18cecd37-afce-493b-8a69-3b54afe5f0fd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9"
Apr 16 16:53:47.828948 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.828883 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/18cecd37-afce-493b-8a69-3b54afe5f0fd-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-wcfq9\" (UID: \"18cecd37-afce-493b-8a69-3b54afe5f0fd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9"
Apr 16 16:53:47.829033 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.829012 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/18cecd37-afce-493b-8a69-3b54afe5f0fd-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-wcfq9\" (UID: \"18cecd37-afce-493b-8a69-3b54afe5f0fd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9"
Apr 16 16:53:47.829133 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.829087 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/18cecd37-afce-493b-8a69-3b54afe5f0fd-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-wcfq9\" (UID: \"18cecd37-afce-493b-8a69-3b54afe5f0fd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9"
Apr 16 16:53:47.834168 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.834141 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/18cecd37-afce-493b-8a69-3b54afe5f0fd-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-wcfq9\" (UID: \"18cecd37-afce-493b-8a69-3b54afe5f0fd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9"
Apr 16 16:53:47.834773 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.834756 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgrkl\" (UniqueName: \"kubernetes.io/projected/18cecd37-afce-493b-8a69-3b54afe5f0fd-kube-api-access-bgrkl\") pod \"istiod-openshift-gateway-7cd77c7ffd-wcfq9\" (UID: \"18cecd37-afce-493b-8a69-3b54afe5f0fd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9"
Apr 16 16:53:47.924818 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:47.924785 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9"
Apr 16 16:53:48.057183 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:48:48.054683 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9"]
Apr 16 16:53:48.817532 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:48.817479 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9" event={"ID":"18cecd37-afce-493b-8a69-3b54afe5f0fd","Type":"ContainerStarted","Data":"a3b5049dec6096ace8b83a177558b2f46ac034465875de20266906a47c84aff6"}
Apr 16 16:53:50.439310 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:50.439276 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 16 16:53:50.439537 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:50.439344 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 16 16:53:50.825578 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:50.825480 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9" event={"ID":"18cecd37-afce-493b-8a69-3b54afe5f0fd","Type":"ContainerStarted","Data":"080ba290e4aaf1dc0deab2af73a13ab517f8fd4788f592779973c39d245f8a2e"}
Apr 16 16:53:50.825753 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:50.825607 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9"
Apr 16 16:53:50.848372 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:50.848325 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9" podStartSLOduration=1.469971881 podStartE2EDuration="3.848311239s" podCreationTimestamp="2026-04-16 16:53:47 +0000 UTC" firstStartedPulling="2026-04-16 16:53:48.060651359 +0000 UTC m=+345.229362702" lastFinishedPulling="2026-04-16 16:53:50.438990711 +0000 UTC m=+347.607702060" observedRunningTime="2026-04-16 16:53:50.846329988 +0000 UTC m=+348.015041351" watchObservedRunningTime="2026-04-16 16:53:50.848311239 +0000 UTC m=+348.017022601"
Apr 16 16:53:51.831000 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:51.830970 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9"
Apr 16 16:53:54.500742 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.500706 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"]
Apr 16 16:53:54.503039 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.503023 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:53:54.505885 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.505864 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-dfpjh\""
Apr 16 16:53:54.517129 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.517103 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"]
Apr 16 16:53:54.582606 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.582568 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/c11ab2b7-a15d-45b9-95e3-690208f5d272-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-pm5bn\" (UID: \"c11ab2b7-a15d-45b9-95e3-690208f5d272\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:53:54.582606 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.582602 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c11ab2b7-a15d-45b9-95e3-690208f5d272-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-pm5bn\" (UID: \"c11ab2b7-a15d-45b9-95e3-690208f5d272\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:53:54.582809 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.582628 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/c11ab2b7-a15d-45b9-95e3-690208f5d272-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-pm5bn\" (UID: \"c11ab2b7-a15d-45b9-95e3-690208f5d272\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:53:54.582809 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.582645 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/c11ab2b7-a15d-45b9-95e3-690208f5d272-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-pm5bn\" (UID: \"c11ab2b7-a15d-45b9-95e3-690208f5d272\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:53:54.582809 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.582701 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/c11ab2b7-a15d-45b9-95e3-690208f5d272-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-pm5bn\" (UID: \"c11ab2b7-a15d-45b9-95e3-690208f5d272\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:53:54.582809 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.582729 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c11ab2b7-a15d-45b9-95e3-690208f5d272-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-pm5bn\" (UID: \"c11ab2b7-a15d-45b9-95e3-690208f5d272\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:53:54.582809 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.582743 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8d6d\" (UniqueName: \"kubernetes.io/projected/c11ab2b7-a15d-45b9-95e3-690208f5d272-kube-api-access-w8d6d\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-pm5bn\" (UID: \"c11ab2b7-a15d-45b9-95e3-690208f5d272\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:53:54.582809 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.582763 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/c11ab2b7-a15d-45b9-95e3-690208f5d272-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-pm5bn\" (UID: \"c11ab2b7-a15d-45b9-95e3-690208f5d272\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:53:54.582809 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.582787 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/c11ab2b7-a15d-45b9-95e3-690208f5d272-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-pm5bn\" (UID: \"c11ab2b7-a15d-45b9-95e3-690208f5d272\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:53:54.683362 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.683323 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/c11ab2b7-a15d-45b9-95e3-690208f5d272-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-pm5bn\" (UID: \"c11ab2b7-a15d-45b9-95e3-690208f5d272\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:53:54.683519 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.683373 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c11ab2b7-a15d-45b9-95e3-690208f5d272-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-pm5bn\" (UID: \"c11ab2b7-a15d-45b9-95e3-690208f5d272\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:53:54.683519 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.683416 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/c11ab2b7-a15d-45b9-95e3-690208f5d272-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-pm5bn\" (UID: \"c11ab2b7-a15d-45b9-95e3-690208f5d272\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:53:54.683519 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.683443 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/c11ab2b7-a15d-45b9-95e3-690208f5d272-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-pm5bn\" (UID: \"c11ab2b7-a15d-45b9-95e3-690208f5d272\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:53:54.683519 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.683480 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/c11ab2b7-a15d-45b9-95e3-690208f5d272-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-pm5bn\" (UID: \"c11ab2b7-a15d-45b9-95e3-690208f5d272\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:53:54.683736 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.683519 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c11ab2b7-a15d-45b9-95e3-690208f5d272-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-pm5bn\" (UID: \"c11ab2b7-a15d-45b9-95e3-690208f5d272\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:53:54.683736 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.683545 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w8d6d\" (UniqueName: \"kubernetes.io/projected/c11ab2b7-a15d-45b9-95e3-690208f5d272-kube-api-access-w8d6d\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-pm5bn\" (UID: \"c11ab2b7-a15d-45b9-95e3-690208f5d272\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:53:54.683736 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.683580 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/c11ab2b7-a15d-45b9-95e3-690208f5d272-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-pm5bn\" (UID: \"c11ab2b7-a15d-45b9-95e3-690208f5d272\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:53:54.683736 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.683631 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/c11ab2b7-a15d-45b9-95e3-690208f5d272-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-pm5bn\" (UID: \"c11ab2b7-a15d-45b9-95e3-690208f5d272\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:53:54.683921 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.683881 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/c11ab2b7-a15d-45b9-95e3-690208f5d272-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-pm5bn\" (UID: \"c11ab2b7-a15d-45b9-95e3-690208f5d272\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:53:54.684116 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.684094 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/c11ab2b7-a15d-45b9-95e3-690208f5d272-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-pm5bn\" (UID: \"c11ab2b7-a15d-45b9-95e3-690208f5d272\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:53:54.684227 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.684204 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/c11ab2b7-a15d-45b9-95e3-690208f5d272-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-pm5bn\" (UID: \"c11ab2b7-a15d-45b9-95e3-690208f5d272\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:53:54.684330 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.684309 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/c11ab2b7-a15d-45b9-95e3-690208f5d272-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-pm5bn\" (UID: \"c11ab2b7-a15d-45b9-95e3-690208f5d272\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:53:54.684454 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.684437 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/c11ab2b7-a15d-45b9-95e3-690208f5d272-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-pm5bn\" (UID: \"c11ab2b7-a15d-45b9-95e3-690208f5d272\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:53:54.685733 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.685706 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/c11ab2b7-a15d-45b9-95e3-690208f5d272-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-pm5bn\" (UID: \"c11ab2b7-a15d-45b9-95e3-690208f5d272\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:53:54.686028 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.686009 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c11ab2b7-a15d-45b9-95e3-690208f5d272-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-pm5bn\" (UID: \"c11ab2b7-a15d-45b9-95e3-690208f5d272\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:53:54.691441 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.691417 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c11ab2b7-a15d-45b9-95e3-690208f5d272-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-pm5bn\" (UID: \"c11ab2b7-a15d-45b9-95e3-690208f5d272\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:53:54.691816 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.691795 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8d6d\" (UniqueName: \"kubernetes.io/projected/c11ab2b7-a15d-45b9-95e3-690208f5d272-kube-api-access-w8d6d\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-pm5bn\" (UID: \"c11ab2b7-a15d-45b9-95e3-690208f5d272\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:53:54.806521 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.806451 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-zhckx"
Apr 16 16:53:54.816748 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.816726 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:53:54.948893 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:54.948861 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"]
Apr 16 16:53:54.951985 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:53:54.951948 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc11ab2b7_a15d_45b9_95e3_690208f5d272.slice/crio-472005e8fca30e6e2dbcbf5f58ccb240199d2c654e0a3db074a08b9cb8d3b60a WatchSource:0}: Error finding container 472005e8fca30e6e2dbcbf5f58ccb240199d2c654e0a3db074a08b9cb8d3b60a: Status 404 returned error can't find the container with id 472005e8fca30e6e2dbcbf5f58ccb240199d2c654e0a3db074a08b9cb8d3b60a
Apr 16 16:53:55.842827 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:55.842789 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn" event={"ID":"c11ab2b7-a15d-45b9-95e3-690208f5d272","Type":"ContainerStarted","Data":"472005e8fca30e6e2dbcbf5f58ccb240199d2c654e0a3db074a08b9cb8d3b60a"}
Apr 16 16:53:57.688921 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:57.688891 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 16 16:53:57.689198 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:57.688959 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 16 16:53:57.689198 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:57.688986 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 16 16:53:57.852362 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:57.852322 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn" event={"ID":"c11ab2b7-a15d-45b9-95e3-690208f5d272","Type":"ContainerStarted","Data":"b51969ca4517f511735c4c79c380f0faf197b0b08c0b977f5cae0785b05167c7"}
Apr 16 16:53:57.881542 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:57.881493 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn" podStartSLOduration=1.14698175 podStartE2EDuration="3.881479293s" podCreationTimestamp="2026-04-16 16:53:54 +0000 UTC" firstStartedPulling="2026-04-16 16:53:54.954155659 +0000 UTC m=+352.122867000" lastFinishedPulling="2026-04-16 16:53:57.688653199 +0000 UTC m=+354.857364543" observedRunningTime="2026-04-16 16:53:57.879173041 +0000 UTC m=+355.047884420" watchObservedRunningTime="2026-04-16 16:53:57.881479293 +0000 UTC m=+355.050190655"
Apr 16 16:53:58.817296 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:58.817264 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:53:58.821896 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:58.821870 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:53:58.855931 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:58.855906 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:53:58.856630 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:53:58.856611 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-pm5bn"
Apr 16 16:54:02.107504 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.107475 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bs7kk4"]
Apr 16 16:54:02.109927 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.109910 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bs7kk4"
Apr 16 16:54:02.112511 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.112486 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 16:54:02.113784 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.113761 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wj6nm\""
Apr 16 16:54:02.113887 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.113793 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 16:54:02.118749 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.118727 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bs7kk4"]
Apr 16 16:54:02.141869 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.141850 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c8245f56-6b44-4411-a8ef-ddd146797d15-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bs7kk4\" (UID: \"c8245f56-6b44-4411-a8ef-ddd146797d15\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bs7kk4"
Apr 16 16:54:02.141959 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.141885 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c8245f56-6b44-4411-a8ef-ddd146797d15-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bs7kk4\" (UID: \"c8245f56-6b44-4411-a8ef-ddd146797d15\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bs7kk4"
Apr 16 16:54:02.141959 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.141923 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpmcz\" (UniqueName: \"kubernetes.io/projected/c8245f56-6b44-4411-a8ef-ddd146797d15-kube-api-access-bpmcz\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bs7kk4\" (UID: \"c8245f56-6b44-4411-a8ef-ddd146797d15\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bs7kk4"
Apr 16 16:54:02.189155 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.189132 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pxvsv"]
Apr 16 16:54:02.191280 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.191267 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pxvsv"
Apr 16 16:54:02.200000 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.199981 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pxvsv"]
Apr 16 16:54:02.242688 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.242665 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9eb041de-9aa8-4afe-8cc5-277411465377-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pxvsv\" (UID: \"9eb041de-9aa8-4afe-8cc5-277411465377\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pxvsv"
Apr 16 16:54:02.242783 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.242700 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bpmcz\" (UniqueName: \"kubernetes.io/projected/c8245f56-6b44-4411-a8ef-ddd146797d15-kube-api-access-bpmcz\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bs7kk4\" (UID: \"c8245f56-6b44-4411-a8ef-ddd146797d15\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bs7kk4"
Apr 16 16:54:02.242783 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.242727 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6n46\" (UniqueName: \"kubernetes.io/projected/9eb041de-9aa8-4afe-8cc5-277411465377-kube-api-access-h6n46\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pxvsv\" (UID: \"9eb041de-9aa8-4afe-8cc5-277411465377\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pxvsv"
Apr 16 16:54:02.242783 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.242757 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c8245f56-6b44-4411-a8ef-ddd146797d15-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bs7kk4\" (UID: \"c8245f56-6b44-4411-a8ef-ddd146797d15\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bs7kk4"
Apr 16 16:54:02.242925 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.242790 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9eb041de-9aa8-4afe-8cc5-277411465377-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pxvsv\" (UID: \"9eb041de-9aa8-4afe-8cc5-277411465377\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pxvsv"
Apr 16 16:54:02.242925 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.242873 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c8245f56-6b44-4411-a8ef-ddd146797d15-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bs7kk4\" (UID: \"c8245f56-6b44-4411-a8ef-ddd146797d15\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bs7kk4"
Apr 16 16:54:02.243172 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.243152 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c8245f56-6b44-4411-a8ef-ddd146797d15-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bs7kk4\" (UID: \"c8245f56-6b44-4411-a8ef-ddd146797d15\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bs7kk4"
Apr 16 16:54:02.243211 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.243193 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c8245f56-6b44-4411-a8ef-ddd146797d15-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bs7kk4\" (UID: \"c8245f56-6b44-4411-a8ef-ddd146797d15\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bs7kk4"
Apr 16 16:54:02.250262 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.250241 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpmcz\" (UniqueName: \"kubernetes.io/projected/c8245f56-6b44-4411-a8ef-ddd146797d15-kube-api-access-bpmcz\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bs7kk4\" (UID: \"c8245f56-6b44-4411-a8ef-ddd146797d15\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bs7kk4"
Apr 16 16:54:02.289773 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.289752 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bbd6t"]
Apr 16 16:54:02.292221 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.292207 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bbd6t"
Apr 16 16:54:02.300442 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.300422 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bbd6t"]
Apr 16 16:54:02.343658 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.343632 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9eb041de-9aa8-4afe-8cc5-277411465377-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pxvsv\" (UID: \"9eb041de-9aa8-4afe-8cc5-277411465377\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pxvsv"
Apr 16 16:54:02.343762 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.343714 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9eb041de-9aa8-4afe-8cc5-277411465377-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pxvsv\" (UID: \"9eb041de-9aa8-4afe-8cc5-277411465377\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pxvsv"
Apr 16 16:54:02.343762 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.343753 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dda337df-05d0-4443-984a-0cf00fe34cc4-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bbd6t\" (UID: \"dda337df-05d0-4443-984a-0cf00fe34cc4\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bbd6t"
Apr 16 16:54:02.343863 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.343781 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8x9z\" (UniqueName: \"kubernetes.io/projected/dda337df-05d0-4443-984a-0cf00fe34cc4-kube-api-access-j8x9z\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bbd6t\" (UID: \"dda337df-05d0-4443-984a-0cf00fe34cc4\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bbd6t"
Apr 16 16:54:02.343910 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.343855 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6n46\" (UniqueName: \"kubernetes.io/projected/9eb041de-9aa8-4afe-8cc5-277411465377-kube-api-access-h6n46\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pxvsv\" (UID: \"9eb041de-9aa8-4afe-8cc5-277411465377\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pxvsv"
Apr 16 16:54:02.343910 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.343883 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dda337df-05d0-4443-984a-0cf00fe34cc4-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bbd6t\" (UID: \"dda337df-05d0-4443-984a-0cf00fe34cc4\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bbd6t"
Apr 16 16:54:02.344009 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.343963 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9eb041de-9aa8-4afe-8cc5-277411465377-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pxvsv\" (UID: \"9eb041de-9aa8-4afe-8cc5-277411465377\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pxvsv"
Apr 16 16:54:02.344009 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.343987 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName:
\"kubernetes.io/empty-dir/9eb041de-9aa8-4afe-8cc5-277411465377-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pxvsv\" (UID: \"9eb041de-9aa8-4afe-8cc5-277411465377\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pxvsv" Apr 16 16:54:02.351892 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.351872 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6n46\" (UniqueName: \"kubernetes.io/projected/9eb041de-9aa8-4afe-8cc5-277411465377-kube-api-access-h6n46\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pxvsv\" (UID: \"9eb041de-9aa8-4afe-8cc5-277411465377\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pxvsv" Apr 16 16:54:02.391756 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.391733 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c3044lkl"] Apr 16 16:54:02.394242 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.394229 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c3044lkl" Apr 16 16:54:02.403133 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.403115 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c3044lkl"] Apr 16 16:54:02.421642 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.421606 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bs7kk4" Apr 16 16:54:02.444643 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.444618 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dda337df-05d0-4443-984a-0cf00fe34cc4-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bbd6t\" (UID: \"dda337df-05d0-4443-984a-0cf00fe34cc4\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bbd6t" Apr 16 16:54:02.444742 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.444651 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8x9z\" (UniqueName: \"kubernetes.io/projected/dda337df-05d0-4443-984a-0cf00fe34cc4-kube-api-access-j8x9z\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bbd6t\" (UID: \"dda337df-05d0-4443-984a-0cf00fe34cc4\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bbd6t" Apr 16 16:54:02.444742 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.444675 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6afffe93-3a45-44b4-8377-377795c630d6-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c3044lkl\" (UID: \"6afffe93-3a45-44b4-8377-377795c630d6\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c3044lkl" Apr 16 16:54:02.444742 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.444705 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6afffe93-3a45-44b4-8377-377795c630d6-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c3044lkl\" (UID: \"6afffe93-3a45-44b4-8377-377795c630d6\") " 
pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c3044lkl" Apr 16 16:54:02.444859 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.444747 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dda337df-05d0-4443-984a-0cf00fe34cc4-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bbd6t\" (UID: \"dda337df-05d0-4443-984a-0cf00fe34cc4\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bbd6t" Apr 16 16:54:02.444859 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.444840 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w46l8\" (UniqueName: \"kubernetes.io/projected/6afffe93-3a45-44b4-8377-377795c630d6-kube-api-access-w46l8\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c3044lkl\" (UID: \"6afffe93-3a45-44b4-8377-377795c630d6\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c3044lkl" Apr 16 16:54:02.445080 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.445039 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dda337df-05d0-4443-984a-0cf00fe34cc4-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bbd6t\" (UID: \"dda337df-05d0-4443-984a-0cf00fe34cc4\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bbd6t" Apr 16 16:54:02.445117 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.445051 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dda337df-05d0-4443-984a-0cf00fe34cc4-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bbd6t\" (UID: \"dda337df-05d0-4443-984a-0cf00fe34cc4\") " 
pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bbd6t" Apr 16 16:54:02.453480 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.453444 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8x9z\" (UniqueName: \"kubernetes.io/projected/dda337df-05d0-4443-984a-0cf00fe34cc4-kube-api-access-j8x9z\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bbd6t\" (UID: \"dda337df-05d0-4443-984a-0cf00fe34cc4\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bbd6t" Apr 16 16:54:02.499743 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.499714 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pxvsv" Apr 16 16:54:02.543116 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.543082 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bs7kk4"] Apr 16 16:54:02.543520 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:54:02.543497 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8245f56_6b44_4411_a8ef_ddd146797d15.slice/crio-038d37d363619a411f4c553772ed435f2ce3312e48e7f8d5a9cbb7ade4d4a2c6 WatchSource:0}: Error finding container 038d37d363619a411f4c553772ed435f2ce3312e48e7f8d5a9cbb7ade4d4a2c6: Status 404 returned error can't find the container with id 038d37d363619a411f4c553772ed435f2ce3312e48e7f8d5a9cbb7ade4d4a2c6 Apr 16 16:54:02.545426 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.545404 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6afffe93-3a45-44b4-8377-377795c630d6-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c3044lkl\" (UID: 
\"6afffe93-3a45-44b4-8377-377795c630d6\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c3044lkl" Apr 16 16:54:02.545495 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.545454 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6afffe93-3a45-44b4-8377-377795c630d6-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c3044lkl\" (UID: \"6afffe93-3a45-44b4-8377-377795c630d6\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c3044lkl" Apr 16 16:54:02.545535 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.545497 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w46l8\" (UniqueName: \"kubernetes.io/projected/6afffe93-3a45-44b4-8377-377795c630d6-kube-api-access-w46l8\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c3044lkl\" (UID: \"6afffe93-3a45-44b4-8377-377795c630d6\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c3044lkl" Apr 16 16:54:02.545845 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.545781 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6afffe93-3a45-44b4-8377-377795c630d6-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c3044lkl\" (UID: \"6afffe93-3a45-44b4-8377-377795c630d6\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c3044lkl" Apr 16 16:54:02.545845 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.545811 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6afffe93-3a45-44b4-8377-377795c630d6-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c3044lkl\" (UID: \"6afffe93-3a45-44b4-8377-377795c630d6\") " 
pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c3044lkl" Apr 16 16:54:02.555995 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.555951 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w46l8\" (UniqueName: \"kubernetes.io/projected/6afffe93-3a45-44b4-8377-377795c630d6-kube-api-access-w46l8\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c3044lkl\" (UID: \"6afffe93-3a45-44b4-8377-377795c630d6\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c3044lkl" Apr 16 16:54:02.601824 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.601768 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bbd6t" Apr 16 16:54:02.628274 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.628248 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pxvsv"] Apr 16 16:54:02.630256 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:54:02.630222 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9eb041de_9aa8_4afe_8cc5_277411465377.slice/crio-f7d6dffffc52f5da1dc241734a36ebac21cc87452c0c9c96520e36d0bec5d684 WatchSource:0}: Error finding container f7d6dffffc52f5da1dc241734a36ebac21cc87452c0c9c96520e36d0bec5d684: Status 404 returned error can't find the container with id f7d6dffffc52f5da1dc241734a36ebac21cc87452c0c9c96520e36d0bec5d684 Apr 16 16:54:02.703395 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.703369 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c3044lkl" Apr 16 16:54:02.724967 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.724938 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bbd6t"] Apr 16 16:54:02.727158 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:54:02.727130 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddda337df_05d0_4443_984a_0cf00fe34cc4.slice/crio-ce684f0019b9d8f82970ea5a6c2bafba08f791a56981ba832fb5c102a8e3dc2f WatchSource:0}: Error finding container ce684f0019b9d8f82970ea5a6c2bafba08f791a56981ba832fb5c102a8e3dc2f: Status 404 returned error can't find the container with id ce684f0019b9d8f82970ea5a6c2bafba08f791a56981ba832fb5c102a8e3dc2f Apr 16 16:54:02.836469 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.836446 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c3044lkl"] Apr 16 16:54:02.837989 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:54:02.837964 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6afffe93_3a45_44b4_8377_377795c630d6.slice/crio-d37d8af35d2ded9defeb1b2d338f01c6c8c842908b0a1b3cdc421cad94e5ca94 WatchSource:0}: Error finding container d37d8af35d2ded9defeb1b2d338f01c6c8c842908b0a1b3cdc421cad94e5ca94: Status 404 returned error can't find the container with id d37d8af35d2ded9defeb1b2d338f01c6c8c842908b0a1b3cdc421cad94e5ca94 Apr 16 16:54:02.870375 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.870348 2572 generic.go:358] "Generic (PLEG): container finished" podID="c8245f56-6b44-4411-a8ef-ddd146797d15" containerID="31e048fc4112205ab8f1e2e413e0cf57ee29cbfc8c9c25c7843b659995e90006" exitCode=0 Apr 16 16:54:02.870450 
ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.870376 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bs7kk4" event={"ID":"c8245f56-6b44-4411-a8ef-ddd146797d15","Type":"ContainerDied","Data":"31e048fc4112205ab8f1e2e413e0cf57ee29cbfc8c9c25c7843b659995e90006"} Apr 16 16:54:02.870450 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.870406 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bs7kk4" event={"ID":"c8245f56-6b44-4411-a8ef-ddd146797d15","Type":"ContainerStarted","Data":"038d37d363619a411f4c553772ed435f2ce3312e48e7f8d5a9cbb7ade4d4a2c6"} Apr 16 16:54:02.871507 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.871482 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c3044lkl" event={"ID":"6afffe93-3a45-44b4-8377-377795c630d6","Type":"ContainerStarted","Data":"d37d8af35d2ded9defeb1b2d338f01c6c8c842908b0a1b3cdc421cad94e5ca94"} Apr 16 16:54:02.872702 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.872683 2572 generic.go:358] "Generic (PLEG): container finished" podID="dda337df-05d0-4443-984a-0cf00fe34cc4" containerID="bd312f8277dc7cb067b67bf58d16bf87e2c561f3f66c2eb8e44fcf58a570ba56" exitCode=0 Apr 16 16:54:02.872798 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.872726 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bbd6t" event={"ID":"dda337df-05d0-4443-984a-0cf00fe34cc4","Type":"ContainerDied","Data":"bd312f8277dc7cb067b67bf58d16bf87e2c561f3f66c2eb8e44fcf58a570ba56"} Apr 16 16:54:02.872798 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.872751 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bbd6t" 
event={"ID":"dda337df-05d0-4443-984a-0cf00fe34cc4","Type":"ContainerStarted","Data":"ce684f0019b9d8f82970ea5a6c2bafba08f791a56981ba832fb5c102a8e3dc2f"} Apr 16 16:54:02.874054 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.874031 2572 generic.go:358] "Generic (PLEG): container finished" podID="9eb041de-9aa8-4afe-8cc5-277411465377" containerID="c6c905c311be39df5637ae7f83cc6c3fa14a695de9eaf75d7a13ba4fcac94434" exitCode=0 Apr 16 16:54:02.874151 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.874111 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pxvsv" event={"ID":"9eb041de-9aa8-4afe-8cc5-277411465377","Type":"ContainerDied","Data":"c6c905c311be39df5637ae7f83cc6c3fa14a695de9eaf75d7a13ba4fcac94434"} Apr 16 16:54:02.874151 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:02.874134 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pxvsv" event={"ID":"9eb041de-9aa8-4afe-8cc5-277411465377","Type":"ContainerStarted","Data":"f7d6dffffc52f5da1dc241734a36ebac21cc87452c0c9c96520e36d0bec5d684"} Apr 16 16:54:03.879149 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:03.879123 2572 generic.go:358] "Generic (PLEG): container finished" podID="c8245f56-6b44-4411-a8ef-ddd146797d15" containerID="1ac0a641fea9a367dfc27ef7ac4e0a8d02f6ea34fbcc18c21931b9dc42ed7a1c" exitCode=0 Apr 16 16:54:03.879496 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:03.879206 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bs7kk4" event={"ID":"c8245f56-6b44-4411-a8ef-ddd146797d15","Type":"ContainerDied","Data":"1ac0a641fea9a367dfc27ef7ac4e0a8d02f6ea34fbcc18c21931b9dc42ed7a1c"} Apr 16 16:54:03.880505 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:03.880484 2572 generic.go:358] "Generic (PLEG): container finished" 
podID="6afffe93-3a45-44b4-8377-377795c630d6" containerID="cce818621b1dad6857e99955bed01f1a9c699218b26d009701957858ecbda0a7" exitCode=0 Apr 16 16:54:03.880589 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:03.880553 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c3044lkl" event={"ID":"6afffe93-3a45-44b4-8377-377795c630d6","Type":"ContainerDied","Data":"cce818621b1dad6857e99955bed01f1a9c699218b26d009701957858ecbda0a7"} Apr 16 16:54:03.882243 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:03.882220 2572 generic.go:358] "Generic (PLEG): container finished" podID="dda337df-05d0-4443-984a-0cf00fe34cc4" containerID="f26a18fdd1bd560f586ea14f90404ecb295c2ef351232e80d41628ee51c03c9a" exitCode=0 Apr 16 16:54:03.882330 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:03.882291 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bbd6t" event={"ID":"dda337df-05d0-4443-984a-0cf00fe34cc4","Type":"ContainerDied","Data":"f26a18fdd1bd560f586ea14f90404ecb295c2ef351232e80d41628ee51c03c9a"} Apr 16 16:54:03.884128 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:03.884108 2572 generic.go:358] "Generic (PLEG): container finished" podID="9eb041de-9aa8-4afe-8cc5-277411465377" containerID="fe6d2afca898645b418e428bac7a4b0716a5ecab5fdf8b91402aae2a996ddeb1" exitCode=0 Apr 16 16:54:03.884210 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:03.884171 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pxvsv" event={"ID":"9eb041de-9aa8-4afe-8cc5-277411465377","Type":"ContainerDied","Data":"fe6d2afca898645b418e428bac7a4b0716a5ecab5fdf8b91402aae2a996ddeb1"} Apr 16 16:54:04.889289 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:04.889256 2572 generic.go:358] "Generic (PLEG): container finished" 
podID="dda337df-05d0-4443-984a-0cf00fe34cc4" containerID="707be40d53b131126dfaeaa5cf8a7181e273593efa7cbac025e4115a30be38ac" exitCode=0 Apr 16 16:54:04.889661 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:04.889336 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bbd6t" event={"ID":"dda337df-05d0-4443-984a-0cf00fe34cc4","Type":"ContainerDied","Data":"707be40d53b131126dfaeaa5cf8a7181e273593efa7cbac025e4115a30be38ac"} Apr 16 16:54:04.890998 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:04.890969 2572 generic.go:358] "Generic (PLEG): container finished" podID="9eb041de-9aa8-4afe-8cc5-277411465377" containerID="d7891df3b6a1382f4894785d65e6dad4697b60dd22f3d073dad85aeae10d8d57" exitCode=0 Apr 16 16:54:04.891116 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:04.891050 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pxvsv" event={"ID":"9eb041de-9aa8-4afe-8cc5-277411465377","Type":"ContainerDied","Data":"d7891df3b6a1382f4894785d65e6dad4697b60dd22f3d073dad85aeae10d8d57"} Apr 16 16:54:04.892709 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:04.892688 2572 generic.go:358] "Generic (PLEG): container finished" podID="c8245f56-6b44-4411-a8ef-ddd146797d15" containerID="949638df3743c5e7b600684151533a8848e15bcac32888941531457f09751182" exitCode=0 Apr 16 16:54:04.892788 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:04.892774 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bs7kk4" event={"ID":"c8245f56-6b44-4411-a8ef-ddd146797d15","Type":"ContainerDied","Data":"949638df3743c5e7b600684151533a8848e15bcac32888941531457f09751182"} Apr 16 16:54:04.894186 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:04.894167 2572 generic.go:358] "Generic (PLEG): container finished" 
podID="6afffe93-3a45-44b4-8377-377795c630d6" containerID="c74e10dabd265399d06300b3827178b8210931251ba875033c4cc3fe3b16f7fd" exitCode=0 Apr 16 16:54:04.894279 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:04.894201 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c3044lkl" event={"ID":"6afffe93-3a45-44b4-8377-377795c630d6","Type":"ContainerDied","Data":"c74e10dabd265399d06300b3827178b8210931251ba875033c4cc3fe3b16f7fd"} Apr 16 16:54:05.899569 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:05.899529 2572 generic.go:358] "Generic (PLEG): container finished" podID="6afffe93-3a45-44b4-8377-377795c630d6" containerID="ea53a0f8b08f7273e12f1851c2b09f2391e567be8982a834bda83814dd2d3317" exitCode=0 Apr 16 16:54:05.899938 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:05.899608 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c3044lkl" event={"ID":"6afffe93-3a45-44b4-8377-377795c630d6","Type":"ContainerDied","Data":"ea53a0f8b08f7273e12f1851c2b09f2391e567be8982a834bda83814dd2d3317"} Apr 16 16:54:06.033759 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.033737 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bs7kk4" Apr 16 16:54:06.074410 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.074389 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c8245f56-6b44-4411-a8ef-ddd146797d15-util\") pod \"c8245f56-6b44-4411-a8ef-ddd146797d15\" (UID: \"c8245f56-6b44-4411-a8ef-ddd146797d15\") " Apr 16 16:54:06.074522 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.074487 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpmcz\" (UniqueName: \"kubernetes.io/projected/c8245f56-6b44-4411-a8ef-ddd146797d15-kube-api-access-bpmcz\") pod \"c8245f56-6b44-4411-a8ef-ddd146797d15\" (UID: \"c8245f56-6b44-4411-a8ef-ddd146797d15\") " Apr 16 16:54:06.074577 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.074534 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c8245f56-6b44-4411-a8ef-ddd146797d15-bundle\") pod \"c8245f56-6b44-4411-a8ef-ddd146797d15\" (UID: \"c8245f56-6b44-4411-a8ef-ddd146797d15\") " Apr 16 16:54:06.075169 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.074999 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8245f56-6b44-4411-a8ef-ddd146797d15-bundle" (OuterVolumeSpecName: "bundle") pod "c8245f56-6b44-4411-a8ef-ddd146797d15" (UID: "c8245f56-6b44-4411-a8ef-ddd146797d15"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:54:06.076571 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.076551 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pxvsv" Apr 16 16:54:06.076900 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.076881 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8245f56-6b44-4411-a8ef-ddd146797d15-kube-api-access-bpmcz" (OuterVolumeSpecName: "kube-api-access-bpmcz") pod "c8245f56-6b44-4411-a8ef-ddd146797d15" (UID: "c8245f56-6b44-4411-a8ef-ddd146797d15"). InnerVolumeSpecName "kube-api-access-bpmcz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:54:06.080131 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.080115 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bbd6t" Apr 16 16:54:06.080669 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.080650 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8245f56-6b44-4411-a8ef-ddd146797d15-util" (OuterVolumeSpecName: "util") pod "c8245f56-6b44-4411-a8ef-ddd146797d15" (UID: "c8245f56-6b44-4411-a8ef-ddd146797d15"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:54:06.175868 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.175799 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6n46\" (UniqueName: \"kubernetes.io/projected/9eb041de-9aa8-4afe-8cc5-277411465377-kube-api-access-h6n46\") pod \"9eb041de-9aa8-4afe-8cc5-277411465377\" (UID: \"9eb041de-9aa8-4afe-8cc5-277411465377\") " Apr 16 16:54:06.175868 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.175845 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9eb041de-9aa8-4afe-8cc5-277411465377-util\") pod \"9eb041de-9aa8-4afe-8cc5-277411465377\" (UID: \"9eb041de-9aa8-4afe-8cc5-277411465377\") " Apr 16 16:54:06.176107 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.175875 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8x9z\" (UniqueName: \"kubernetes.io/projected/dda337df-05d0-4443-984a-0cf00fe34cc4-kube-api-access-j8x9z\") pod \"dda337df-05d0-4443-984a-0cf00fe34cc4\" (UID: \"dda337df-05d0-4443-984a-0cf00fe34cc4\") " Apr 16 16:54:06.176107 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.175913 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9eb041de-9aa8-4afe-8cc5-277411465377-bundle\") pod \"9eb041de-9aa8-4afe-8cc5-277411465377\" (UID: \"9eb041de-9aa8-4afe-8cc5-277411465377\") " Apr 16 16:54:06.176107 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.175929 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dda337df-05d0-4443-984a-0cf00fe34cc4-bundle\") pod \"dda337df-05d0-4443-984a-0cf00fe34cc4\" (UID: \"dda337df-05d0-4443-984a-0cf00fe34cc4\") " Apr 16 16:54:06.176107 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.175953 2572 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dda337df-05d0-4443-984a-0cf00fe34cc4-util\") pod \"dda337df-05d0-4443-984a-0cf00fe34cc4\" (UID: \"dda337df-05d0-4443-984a-0cf00fe34cc4\") " Apr 16 16:54:06.176303 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.176131 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c8245f56-6b44-4411-a8ef-ddd146797d15-bundle\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:54:06.176303 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.176148 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c8245f56-6b44-4411-a8ef-ddd146797d15-util\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:54:06.176303 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.176162 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bpmcz\" (UniqueName: \"kubernetes.io/projected/c8245f56-6b44-4411-a8ef-ddd146797d15-kube-api-access-bpmcz\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:54:06.176456 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.176429 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dda337df-05d0-4443-984a-0cf00fe34cc4-bundle" (OuterVolumeSpecName: "bundle") pod "dda337df-05d0-4443-984a-0cf00fe34cc4" (UID: "dda337df-05d0-4443-984a-0cf00fe34cc4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:54:06.176648 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.176619 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9eb041de-9aa8-4afe-8cc5-277411465377-bundle" (OuterVolumeSpecName: "bundle") pod "9eb041de-9aa8-4afe-8cc5-277411465377" (UID: "9eb041de-9aa8-4afe-8cc5-277411465377"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:54:06.177985 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.177964 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dda337df-05d0-4443-984a-0cf00fe34cc4-kube-api-access-j8x9z" (OuterVolumeSpecName: "kube-api-access-j8x9z") pod "dda337df-05d0-4443-984a-0cf00fe34cc4" (UID: "dda337df-05d0-4443-984a-0cf00fe34cc4"). InnerVolumeSpecName "kube-api-access-j8x9z". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:54:06.178318 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.178291 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9eb041de-9aa8-4afe-8cc5-277411465377-kube-api-access-h6n46" (OuterVolumeSpecName: "kube-api-access-h6n46") pod "9eb041de-9aa8-4afe-8cc5-277411465377" (UID: "9eb041de-9aa8-4afe-8cc5-277411465377"). InnerVolumeSpecName "kube-api-access-h6n46". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:54:06.181257 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.181239 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9eb041de-9aa8-4afe-8cc5-277411465377-util" (OuterVolumeSpecName: "util") pod "9eb041de-9aa8-4afe-8cc5-277411465377" (UID: "9eb041de-9aa8-4afe-8cc5-277411465377"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:54:06.181620 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.181600 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dda337df-05d0-4443-984a-0cf00fe34cc4-util" (OuterVolumeSpecName: "util") pod "dda337df-05d0-4443-984a-0cf00fe34cc4" (UID: "dda337df-05d0-4443-984a-0cf00fe34cc4"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:54:06.276867 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.276845 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9eb041de-9aa8-4afe-8cc5-277411465377-util\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:54:06.276867 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.276867 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j8x9z\" (UniqueName: \"kubernetes.io/projected/dda337df-05d0-4443-984a-0cf00fe34cc4-kube-api-access-j8x9z\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:54:06.276977 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.276877 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9eb041de-9aa8-4afe-8cc5-277411465377-bundle\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:54:06.276977 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.276886 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dda337df-05d0-4443-984a-0cf00fe34cc4-bundle\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:54:06.276977 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.276894 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dda337df-05d0-4443-984a-0cf00fe34cc4-util\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:54:06.276977 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.276902 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h6n46\" (UniqueName: \"kubernetes.io/projected/9eb041de-9aa8-4afe-8cc5-277411465377-kube-api-access-h6n46\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:54:06.904991 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.904957 2572 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bs7kk4" Apr 16 16:54:06.905456 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.904957 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bs7kk4" event={"ID":"c8245f56-6b44-4411-a8ef-ddd146797d15","Type":"ContainerDied","Data":"038d37d363619a411f4c553772ed435f2ce3312e48e7f8d5a9cbb7ade4d4a2c6"} Apr 16 16:54:06.905456 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.905092 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="038d37d363619a411f4c553772ed435f2ce3312e48e7f8d5a9cbb7ade4d4a2c6" Apr 16 16:54:06.906626 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.906605 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bbd6t" event={"ID":"dda337df-05d0-4443-984a-0cf00fe34cc4","Type":"ContainerDied","Data":"ce684f0019b9d8f82970ea5a6c2bafba08f791a56981ba832fb5c102a8e3dc2f"} Apr 16 16:54:06.906728 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.906628 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bbd6t" Apr 16 16:54:06.906728 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.906628 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce684f0019b9d8f82970ea5a6c2bafba08f791a56981ba832fb5c102a8e3dc2f" Apr 16 16:54:06.908277 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.908256 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pxvsv" Apr 16 16:54:06.908277 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.908270 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pxvsv" event={"ID":"9eb041de-9aa8-4afe-8cc5-277411465377","Type":"ContainerDied","Data":"f7d6dffffc52f5da1dc241734a36ebac21cc87452c0c9c96520e36d0bec5d684"} Apr 16 16:54:06.908435 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:06.908292 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7d6dffffc52f5da1dc241734a36ebac21cc87452c0c9c96520e36d0bec5d684" Apr 16 16:54:07.025912 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:07.025890 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c3044lkl" Apr 16 16:54:07.082257 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:07.082232 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6afffe93-3a45-44b4-8377-377795c630d6-util\") pod \"6afffe93-3a45-44b4-8377-377795c630d6\" (UID: \"6afffe93-3a45-44b4-8377-377795c630d6\") " Apr 16 16:54:07.082408 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:07.082277 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w46l8\" (UniqueName: \"kubernetes.io/projected/6afffe93-3a45-44b4-8377-377795c630d6-kube-api-access-w46l8\") pod \"6afffe93-3a45-44b4-8377-377795c630d6\" (UID: \"6afffe93-3a45-44b4-8377-377795c630d6\") " Apr 16 16:54:07.082408 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:07.082323 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6afffe93-3a45-44b4-8377-377795c630d6-bundle\") pod 
\"6afffe93-3a45-44b4-8377-377795c630d6\" (UID: \"6afffe93-3a45-44b4-8377-377795c630d6\") " Apr 16 16:54:07.082877 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:07.082851 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6afffe93-3a45-44b4-8377-377795c630d6-bundle" (OuterVolumeSpecName: "bundle") pod "6afffe93-3a45-44b4-8377-377795c630d6" (UID: "6afffe93-3a45-44b4-8377-377795c630d6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:54:07.084435 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:07.084411 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6afffe93-3a45-44b4-8377-377795c630d6-kube-api-access-w46l8" (OuterVolumeSpecName: "kube-api-access-w46l8") pod "6afffe93-3a45-44b4-8377-377795c630d6" (UID: "6afffe93-3a45-44b4-8377-377795c630d6"). InnerVolumeSpecName "kube-api-access-w46l8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:54:07.087370 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:07.087332 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6afffe93-3a45-44b4-8377-377795c630d6-util" (OuterVolumeSpecName: "util") pod "6afffe93-3a45-44b4-8377-377795c630d6" (UID: "6afffe93-3a45-44b4-8377-377795c630d6"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:54:07.182960 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:07.182905 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6afffe93-3a45-44b4-8377-377795c630d6-bundle\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:54:07.182960 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:07.182928 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6afffe93-3a45-44b4-8377-377795c630d6-util\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:54:07.182960 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:07.182937 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w46l8\" (UniqueName: \"kubernetes.io/projected/6afffe93-3a45-44b4-8377-377795c630d6-kube-api-access-w46l8\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:54:07.913937 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:07.913895 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c3044lkl" event={"ID":"6afffe93-3a45-44b4-8377-377795c630d6","Type":"ContainerDied","Data":"d37d8af35d2ded9defeb1b2d338f01c6c8c842908b0a1b3cdc421cad94e5ca94"} Apr 16 16:54:07.913937 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:07.913929 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d37d8af35d2ded9defeb1b2d338f01c6c8c842908b0a1b3cdc421cad94e5ca94" Apr 16 16:54:07.913937 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:07.913902 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c3044lkl" Apr 16 16:54:13.085401 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.085368 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-6pk8c"] Apr 16 16:54:13.085845 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.085698 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9eb041de-9aa8-4afe-8cc5-277411465377" containerName="extract" Apr 16 16:54:13.085845 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.085709 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eb041de-9aa8-4afe-8cc5-277411465377" containerName="extract" Apr 16 16:54:13.085845 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.085717 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dda337df-05d0-4443-984a-0cf00fe34cc4" containerName="extract" Apr 16 16:54:13.085845 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.085722 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="dda337df-05d0-4443-984a-0cf00fe34cc4" containerName="extract" Apr 16 16:54:13.085845 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.085729 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6afffe93-3a45-44b4-8377-377795c630d6" containerName="pull" Apr 16 16:54:13.085845 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.085734 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6afffe93-3a45-44b4-8377-377795c630d6" containerName="pull" Apr 16 16:54:13.085845 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.085740 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6afffe93-3a45-44b4-8377-377795c630d6" containerName="extract" Apr 16 16:54:13.085845 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.085745 2572 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6afffe93-3a45-44b4-8377-377795c630d6" containerName="extract" Apr 16 16:54:13.085845 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.085750 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8245f56-6b44-4411-a8ef-ddd146797d15" containerName="pull" Apr 16 16:54:13.085845 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.085755 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8245f56-6b44-4411-a8ef-ddd146797d15" containerName="pull" Apr 16 16:54:13.085845 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.085764 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9eb041de-9aa8-4afe-8cc5-277411465377" containerName="pull" Apr 16 16:54:13.085845 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.085769 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eb041de-9aa8-4afe-8cc5-277411465377" containerName="pull" Apr 16 16:54:13.085845 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.085776 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6afffe93-3a45-44b4-8377-377795c630d6" containerName="util" Apr 16 16:54:13.085845 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.085781 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6afffe93-3a45-44b4-8377-377795c630d6" containerName="util" Apr 16 16:54:13.085845 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.085787 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8245f56-6b44-4411-a8ef-ddd146797d15" containerName="util" Apr 16 16:54:13.085845 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.085792 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8245f56-6b44-4411-a8ef-ddd146797d15" containerName="util" Apr 16 16:54:13.085845 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.085797 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9eb041de-9aa8-4afe-8cc5-277411465377" 
containerName="util" Apr 16 16:54:13.085845 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.085803 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eb041de-9aa8-4afe-8cc5-277411465377" containerName="util" Apr 16 16:54:13.085845 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.085809 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8245f56-6b44-4411-a8ef-ddd146797d15" containerName="extract" Apr 16 16:54:13.085845 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.085813 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8245f56-6b44-4411-a8ef-ddd146797d15" containerName="extract" Apr 16 16:54:13.085845 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.085822 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dda337df-05d0-4443-984a-0cf00fe34cc4" containerName="util" Apr 16 16:54:13.085845 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.085826 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="dda337df-05d0-4443-984a-0cf00fe34cc4" containerName="util" Apr 16 16:54:13.085845 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.085832 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dda337df-05d0-4443-984a-0cf00fe34cc4" containerName="pull" Apr 16 16:54:13.085845 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.085836 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="dda337df-05d0-4443-984a-0cf00fe34cc4" containerName="pull" Apr 16 16:54:13.086564 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.085885 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="9eb041de-9aa8-4afe-8cc5-277411465377" containerName="extract" Apr 16 16:54:13.086564 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.085894 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c8245f56-6b44-4411-a8ef-ddd146797d15" containerName="extract" Apr 16 16:54:13.086564 ip-10-0-137-126 kubenswrapper[2572]: 
I0416 16:54:13.085900 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="6afffe93-3a45-44b4-8377-377795c630d6" containerName="extract" Apr 16 16:54:13.086564 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.085907 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="dda337df-05d0-4443-984a-0cf00fe34cc4" containerName="extract" Apr 16 16:54:13.092159 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.092140 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-6pk8c" Apr 16 16:54:13.095226 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.095203 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 16:54:13.095336 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.095203 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-28s26\"" Apr 16 16:54:13.095336 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.095254 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 16:54:13.095413 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.095255 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 16 16:54:13.100612 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.100589 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-6pk8c"] Apr 16 16:54:13.129492 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.129460 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxqb9\" (UniqueName: \"kubernetes.io/projected/97b063e2-ae46-489b-b95c-bfa8856292f0-kube-api-access-gxqb9\") pod 
\"dns-operator-controller-manager-844548ff4c-6pk8c\" (UID: \"97b063e2-ae46-489b-b95c-bfa8856292f0\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-6pk8c" Apr 16 16:54:13.230485 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.230445 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxqb9\" (UniqueName: \"kubernetes.io/projected/97b063e2-ae46-489b-b95c-bfa8856292f0-kube-api-access-gxqb9\") pod \"dns-operator-controller-manager-844548ff4c-6pk8c\" (UID: \"97b063e2-ae46-489b-b95c-bfa8856292f0\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-6pk8c" Apr 16 16:54:13.247657 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.247630 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxqb9\" (UniqueName: \"kubernetes.io/projected/97b063e2-ae46-489b-b95c-bfa8856292f0-kube-api-access-gxqb9\") pod \"dns-operator-controller-manager-844548ff4c-6pk8c\" (UID: \"97b063e2-ae46-489b-b95c-bfa8856292f0\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-6pk8c" Apr 16 16:54:13.402958 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.402924 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-6pk8c" Apr 16 16:54:13.529367 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.529344 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-6pk8c"] Apr 16 16:54:13.530886 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:54:13.530865 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97b063e2_ae46_489b_b95c_bfa8856292f0.slice/crio-d767ef31e377677fed466145cc4cfddf03cc89bbf6b111d51db2083263a83058 WatchSource:0}: Error finding container d767ef31e377677fed466145cc4cfddf03cc89bbf6b111d51db2083263a83058: Status 404 returned error can't find the container with id d767ef31e377677fed466145cc4cfddf03cc89bbf6b111d51db2083263a83058 Apr 16 16:54:13.940324 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:13.940292 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-6pk8c" event={"ID":"97b063e2-ae46-489b-b95c-bfa8856292f0","Type":"ContainerStarted","Data":"d767ef31e377677fed466145cc4cfddf03cc89bbf6b111d51db2083263a83058"} Apr 16 16:54:15.697346 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:15.697312 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-qrvbs"] Apr 16 16:54:15.700966 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:15.700941 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-qrvbs" Apr 16 16:54:15.703769 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:15.703748 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-mdwz5\"" Apr 16 16:54:15.710448 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:15.710423 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-qrvbs"] Apr 16 16:54:15.755557 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:15.755526 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-865mm\" (UniqueName: \"kubernetes.io/projected/4ab47623-d87a-4da7-bd7b-aa5916314031-kube-api-access-865mm\") pod \"authorino-operator-7587b89b76-qrvbs\" (UID: \"4ab47623-d87a-4da7-bd7b-aa5916314031\") " pod="kuadrant-system/authorino-operator-7587b89b76-qrvbs" Apr 16 16:54:15.856467 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:15.856429 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-865mm\" (UniqueName: \"kubernetes.io/projected/4ab47623-d87a-4da7-bd7b-aa5916314031-kube-api-access-865mm\") pod \"authorino-operator-7587b89b76-qrvbs\" (UID: \"4ab47623-d87a-4da7-bd7b-aa5916314031\") " pod="kuadrant-system/authorino-operator-7587b89b76-qrvbs" Apr 16 16:54:15.864838 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:15.864805 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-865mm\" (UniqueName: \"kubernetes.io/projected/4ab47623-d87a-4da7-bd7b-aa5916314031-kube-api-access-865mm\") pod \"authorino-operator-7587b89b76-qrvbs\" (UID: \"4ab47623-d87a-4da7-bd7b-aa5916314031\") " pod="kuadrant-system/authorino-operator-7587b89b76-qrvbs" Apr 16 16:54:16.014442 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:16.014415 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-qrvbs" Apr 16 16:54:16.137430 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:16.137406 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-qrvbs"] Apr 16 16:54:16.139206 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:54:16.139178 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ab47623_d87a_4da7_bd7b_aa5916314031.slice/crio-fbae78e4d3c4b16262539ff2981930176bb89b82573acf893ea7fbbdcac016ae WatchSource:0}: Error finding container fbae78e4d3c4b16262539ff2981930176bb89b82573acf893ea7fbbdcac016ae: Status 404 returned error can't find the container with id fbae78e4d3c4b16262539ff2981930176bb89b82573acf893ea7fbbdcac016ae Apr 16 16:54:16.953244 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:16.953195 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-6pk8c" event={"ID":"97b063e2-ae46-489b-b95c-bfa8856292f0","Type":"ContainerStarted","Data":"5338bb3b8ad7ac015b2954823141eb303d7bcc91736423c20c1222f45175c404"} Apr 16 16:54:16.953664 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:16.953355 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-6pk8c" Apr 16 16:54:16.954455 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:16.954430 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-qrvbs" event={"ID":"4ab47623-d87a-4da7-bd7b-aa5916314031","Type":"ContainerStarted","Data":"fbae78e4d3c4b16262539ff2981930176bb89b82573acf893ea7fbbdcac016ae"} Apr 16 16:54:16.983924 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:16.983883 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-6pk8c" podStartSLOduration=1.588661037 podStartE2EDuration="3.983871647s" podCreationTimestamp="2026-04-16 16:54:13 +0000 UTC" firstStartedPulling="2026-04-16 16:54:13.532901207 +0000 UTC m=+370.701612548" lastFinishedPulling="2026-04-16 16:54:15.928111804 +0000 UTC m=+373.096823158" observedRunningTime="2026-04-16 16:54:16.981288437 +0000 UTC m=+374.149999800" watchObservedRunningTime="2026-04-16 16:54:16.983871647 +0000 UTC m=+374.152583009" Apr 16 16:54:17.959692 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:17.959603 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-qrvbs" event={"ID":"4ab47623-d87a-4da7-bd7b-aa5916314031","Type":"ContainerStarted","Data":"8a55ec137bd16905aebac864850b8e5ad95654254ff19491946a994c10dd0100"} Apr 16 16:54:17.976905 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:17.976861 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-qrvbs" podStartSLOduration=1.465451182 podStartE2EDuration="2.976848613s" podCreationTimestamp="2026-04-16 16:54:15 +0000 UTC" firstStartedPulling="2026-04-16 16:54:16.147344318 +0000 UTC m=+373.316055659" lastFinishedPulling="2026-04-16 16:54:17.658741726 +0000 UTC m=+374.827453090" observedRunningTime="2026-04-16 16:54:17.975614509 +0000 UTC m=+375.144325869" watchObservedRunningTime="2026-04-16 16:54:17.976848613 +0000 UTC m=+375.145559975" Apr 16 16:54:18.963071 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:18.963034 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-qrvbs" Apr 16 16:54:22.608030 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:22.607995 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-jn6hg"] Apr 16 16:54:22.615310 ip-10-0-137-126 
kubenswrapper[2572]: I0416 16:54:22.615294 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-jn6hg" Apr 16 16:54:22.618408 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:22.618389 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-qqdd7\"" Apr 16 16:54:22.628139 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:22.628119 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-jn6hg"] Apr 16 16:54:22.715537 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:22.715511 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz95g\" (UniqueName: \"kubernetes.io/projected/b37eedbd-8438-4d03-a544-50830b57acf4-kube-api-access-pz95g\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-jn6hg\" (UID: \"b37eedbd-8438-4d03-a544-50830b57acf4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-jn6hg" Apr 16 16:54:22.715674 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:22.715543 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b37eedbd-8438-4d03-a544-50830b57acf4-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-jn6hg\" (UID: \"b37eedbd-8438-4d03-a544-50830b57acf4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-jn6hg" Apr 16 16:54:22.816332 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:22.816298 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b37eedbd-8438-4d03-a544-50830b57acf4-extensions-socket-volume\") pod 
\"kuadrant-operator-controller-manager-6ddf9554fc-jn6hg\" (UID: \"b37eedbd-8438-4d03-a544-50830b57acf4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-jn6hg" Apr 16 16:54:22.816475 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:22.816407 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pz95g\" (UniqueName: \"kubernetes.io/projected/b37eedbd-8438-4d03-a544-50830b57acf4-kube-api-access-pz95g\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-jn6hg\" (UID: \"b37eedbd-8438-4d03-a544-50830b57acf4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-jn6hg" Apr 16 16:54:22.816651 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:22.816631 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b37eedbd-8438-4d03-a544-50830b57acf4-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-jn6hg\" (UID: \"b37eedbd-8438-4d03-a544-50830b57acf4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-jn6hg" Apr 16 16:54:22.825030 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:22.824999 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz95g\" (UniqueName: \"kubernetes.io/projected/b37eedbd-8438-4d03-a544-50830b57acf4-kube-api-access-pz95g\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-jn6hg\" (UID: \"b37eedbd-8438-4d03-a544-50830b57acf4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-jn6hg" Apr 16 16:54:22.925080 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:22.925040 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-jn6hg" Apr 16 16:54:23.068982 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:23.068954 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-jn6hg"] Apr 16 16:54:23.071003 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:54:23.070968 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb37eedbd_8438_4d03_a544_50830b57acf4.slice/crio-c516ac68135a68fedac1245348207b18fa290bbedc015bef5d557c6b364522f8 WatchSource:0}: Error finding container c516ac68135a68fedac1245348207b18fa290bbedc015bef5d557c6b364522f8: Status 404 returned error can't find the container with id c516ac68135a68fedac1245348207b18fa290bbedc015bef5d557c6b364522f8 Apr 16 16:54:23.984721 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:23.984680 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-jn6hg" event={"ID":"b37eedbd-8438-4d03-a544-50830b57acf4","Type":"ContainerStarted","Data":"c516ac68135a68fedac1245348207b18fa290bbedc015bef5d557c6b364522f8"} Apr 16 16:54:27.961829 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:27.961799 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-6pk8c" Apr 16 16:54:28.007849 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:28.007811 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-jn6hg" event={"ID":"b37eedbd-8438-4d03-a544-50830b57acf4","Type":"ContainerStarted","Data":"eeb347a7ea4ffec63b59e66d4c7887d152e9917a5ef3e4ae0214eba040c00882"} Apr 16 16:54:28.007980 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:28.007956 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-jn6hg" Apr 16 16:54:28.030242 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:28.030162 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-jn6hg" podStartSLOduration=1.6770958089999999 podStartE2EDuration="6.030148606s" podCreationTimestamp="2026-04-16 16:54:22 +0000 UTC" firstStartedPulling="2026-04-16 16:54:23.073186145 +0000 UTC m=+380.241897486" lastFinishedPulling="2026-04-16 16:54:27.426238938 +0000 UTC m=+384.594950283" observedRunningTime="2026-04-16 16:54:28.029102472 +0000 UTC m=+385.197813840" watchObservedRunningTime="2026-04-16 16:54:28.030148606 +0000 UTC m=+385.198859968" Apr 16 16:54:29.968935 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:29.968908 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-qrvbs" Apr 16 16:54:39.012935 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:39.012905 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-jn6hg" Apr 16 16:54:43.596482 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:54:43.596445 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7c4ff98cb8-l9wgn"] Apr 16 16:55:08.621014 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:08.620947 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7c4ff98cb8-l9wgn" podUID="9a328696-e46f-4e4f-9ae0-416fa37377cd" containerName="console" containerID="cri-o://5c433262eba882bef2fef72e1de270a81e70d61f5d62109ae210e5f2a38c5f78" gracePeriod=15 Apr 16 16:55:08.861455 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:08.861435 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-7c4ff98cb8-l9wgn_9a328696-e46f-4e4f-9ae0-416fa37377cd/console/0.log" Apr 16 16:55:08.861561 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:08.861492 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c4ff98cb8-l9wgn" Apr 16 16:55:08.958753 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:08.958690 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9a328696-e46f-4e4f-9ae0-416fa37377cd-service-ca\") pod \"9a328696-e46f-4e4f-9ae0-416fa37377cd\" (UID: \"9a328696-e46f-4e4f-9ae0-416fa37377cd\") " Apr 16 16:55:08.958753 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:08.958737 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9a328696-e46f-4e4f-9ae0-416fa37377cd-console-config\") pod \"9a328696-e46f-4e4f-9ae0-416fa37377cd\" (UID: \"9a328696-e46f-4e4f-9ae0-416fa37377cd\") " Apr 16 16:55:08.958753 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:08.958753 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9a328696-e46f-4e4f-9ae0-416fa37377cd-oauth-serving-cert\") pod \"9a328696-e46f-4e4f-9ae0-416fa37377cd\" (UID: \"9a328696-e46f-4e4f-9ae0-416fa37377cd\") " Apr 16 16:55:08.958979 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:08.958780 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9a328696-e46f-4e4f-9ae0-416fa37377cd-console-oauth-config\") pod \"9a328696-e46f-4e4f-9ae0-416fa37377cd\" (UID: \"9a328696-e46f-4e4f-9ae0-416fa37377cd\") " Apr 16 16:55:08.958979 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:08.958805 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a328696-e46f-4e4f-9ae0-416fa37377cd-console-serving-cert\") pod \"9a328696-e46f-4e4f-9ae0-416fa37377cd\" (UID: \"9a328696-e46f-4e4f-9ae0-416fa37377cd\") " Apr 16 16:55:08.958979 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:08.958830 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a328696-e46f-4e4f-9ae0-416fa37377cd-trusted-ca-bundle\") pod \"9a328696-e46f-4e4f-9ae0-416fa37377cd\" (UID: \"9a328696-e46f-4e4f-9ae0-416fa37377cd\") " Apr 16 16:55:08.959224 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:08.959199 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a328696-e46f-4e4f-9ae0-416fa37377cd-service-ca" (OuterVolumeSpecName: "service-ca") pod "9a328696-e46f-4e4f-9ae0-416fa37377cd" (UID: "9a328696-e46f-4e4f-9ae0-416fa37377cd"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:55:08.959314 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:08.959212 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a328696-e46f-4e4f-9ae0-416fa37377cd-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9a328696-e46f-4e4f-9ae0-416fa37377cd" (UID: "9a328696-e46f-4e4f-9ae0-416fa37377cd"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:55:08.959314 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:08.959247 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a328696-e46f-4e4f-9ae0-416fa37377cd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9a328696-e46f-4e4f-9ae0-416fa37377cd" (UID: "9a328696-e46f-4e4f-9ae0-416fa37377cd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:55:08.959314 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:08.959257 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a328696-e46f-4e4f-9ae0-416fa37377cd-console-config" (OuterVolumeSpecName: "console-config") pod "9a328696-e46f-4e4f-9ae0-416fa37377cd" (UID: "9a328696-e46f-4e4f-9ae0-416fa37377cd"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:55:08.960840 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:08.960817 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a328696-e46f-4e4f-9ae0-416fa37377cd-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9a328696-e46f-4e4f-9ae0-416fa37377cd" (UID: "9a328696-e46f-4e4f-9ae0-416fa37377cd"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:55:08.960840 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:08.960835 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a328696-e46f-4e4f-9ae0-416fa37377cd-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9a328696-e46f-4e4f-9ae0-416fa37377cd" (UID: "9a328696-e46f-4e4f-9ae0-416fa37377cd"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:55:09.059234 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:09.059211 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mthx8\" (UniqueName: \"kubernetes.io/projected/9a328696-e46f-4e4f-9ae0-416fa37377cd-kube-api-access-mthx8\") pod \"9a328696-e46f-4e4f-9ae0-416fa37377cd\" (UID: \"9a328696-e46f-4e4f-9ae0-416fa37377cd\") " Apr 16 16:55:09.059374 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:09.059363 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9a328696-e46f-4e4f-9ae0-416fa37377cd-service-ca\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:55:09.059415 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:09.059377 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9a328696-e46f-4e4f-9ae0-416fa37377cd-console-config\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:55:09.059415 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:09.059386 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9a328696-e46f-4e4f-9ae0-416fa37377cd-oauth-serving-cert\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:55:09.059415 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:09.059395 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9a328696-e46f-4e4f-9ae0-416fa37377cd-console-oauth-config\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:55:09.059415 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:09.059404 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a328696-e46f-4e4f-9ae0-416fa37377cd-console-serving-cert\") on node 
\"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:55:09.059415 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:09.059413 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a328696-e46f-4e4f-9ae0-416fa37377cd-trusted-ca-bundle\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:55:09.061019 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:09.060997 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a328696-e46f-4e4f-9ae0-416fa37377cd-kube-api-access-mthx8" (OuterVolumeSpecName: "kube-api-access-mthx8") pod "9a328696-e46f-4e4f-9ae0-416fa37377cd" (UID: "9a328696-e46f-4e4f-9ae0-416fa37377cd"). InnerVolumeSpecName "kube-api-access-mthx8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:55:09.159665 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:09.159645 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mthx8\" (UniqueName: \"kubernetes.io/projected/9a328696-e46f-4e4f-9ae0-416fa37377cd-kube-api-access-mthx8\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:55:09.161344 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:09.161327 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7c4ff98cb8-l9wgn_9a328696-e46f-4e4f-9ae0-416fa37377cd/console/0.log" Apr 16 16:55:09.161440 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:09.161368 2572 generic.go:358] "Generic (PLEG): container finished" podID="9a328696-e46f-4e4f-9ae0-416fa37377cd" containerID="5c433262eba882bef2fef72e1de270a81e70d61f5d62109ae210e5f2a38c5f78" exitCode=2 Apr 16 16:55:09.161440 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:09.161424 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c4ff98cb8-l9wgn" Apr 16 16:55:09.161524 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:09.161459 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c4ff98cb8-l9wgn" event={"ID":"9a328696-e46f-4e4f-9ae0-416fa37377cd","Type":"ContainerDied","Data":"5c433262eba882bef2fef72e1de270a81e70d61f5d62109ae210e5f2a38c5f78"} Apr 16 16:55:09.161524 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:09.161496 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c4ff98cb8-l9wgn" event={"ID":"9a328696-e46f-4e4f-9ae0-416fa37377cd","Type":"ContainerDied","Data":"06d9e22f8ea6f9d638993b9ad76165dca3fd3a3158ccb2db21f1e3c113811d1f"} Apr 16 16:55:09.161524 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:09.161511 2572 scope.go:117] "RemoveContainer" containerID="5c433262eba882bef2fef72e1de270a81e70d61f5d62109ae210e5f2a38c5f78" Apr 16 16:55:09.176272 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:09.176253 2572 scope.go:117] "RemoveContainer" containerID="5c433262eba882bef2fef72e1de270a81e70d61f5d62109ae210e5f2a38c5f78" Apr 16 16:55:09.176624 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:55:09.176588 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c433262eba882bef2fef72e1de270a81e70d61f5d62109ae210e5f2a38c5f78\": container with ID starting with 5c433262eba882bef2fef72e1de270a81e70d61f5d62109ae210e5f2a38c5f78 not found: ID does not exist" containerID="5c433262eba882bef2fef72e1de270a81e70d61f5d62109ae210e5f2a38c5f78" Apr 16 16:55:09.176677 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:09.176644 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c433262eba882bef2fef72e1de270a81e70d61f5d62109ae210e5f2a38c5f78"} err="failed to get container status \"5c433262eba882bef2fef72e1de270a81e70d61f5d62109ae210e5f2a38c5f78\": rpc error: code = 
NotFound desc = could not find container \"5c433262eba882bef2fef72e1de270a81e70d61f5d62109ae210e5f2a38c5f78\": container with ID starting with 5c433262eba882bef2fef72e1de270a81e70d61f5d62109ae210e5f2a38c5f78 not found: ID does not exist" Apr 16 16:55:09.187256 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:09.187235 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7c4ff98cb8-l9wgn"] Apr 16 16:55:09.190735 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:09.190708 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7c4ff98cb8-l9wgn"] Apr 16 16:55:09.590331 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:09.590285 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a328696-e46f-4e4f-9ae0-416fa37377cd" path="/var/lib/kubelet/pods/9a328696-e46f-4e4f-9ae0-416fa37377cd/volumes" Apr 16 16:55:13.165685 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:13.165646 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-lbws6"] Apr 16 16:55:13.166172 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:13.166145 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a328696-e46f-4e4f-9ae0-416fa37377cd" containerName="console" Apr 16 16:55:13.166172 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:13.166163 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a328696-e46f-4e4f-9ae0-416fa37377cd" containerName="console" Apr 16 16:55:13.166309 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:13.166238 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="9a328696-e46f-4e4f-9ae0-416fa37377cd" containerName="console" Apr 16 16:55:13.170719 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:13.170699 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-lbws6" Apr 16 16:55:13.173489 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:13.173470 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 16 16:55:13.173615 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:13.173595 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-rzzrl\"" Apr 16 16:55:13.178723 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:13.178699 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-lbws6"] Apr 16 16:55:13.190519 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:13.190494 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-lbws6"] Apr 16 16:55:13.191516 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:13.191495 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/fab44b43-0ba8-4c44-8c2f-1cfc3e92a166-config-file\") pod \"limitador-limitador-67566c68b4-lbws6\" (UID: \"fab44b43-0ba8-4c44-8c2f-1cfc3e92a166\") " pod="kuadrant-system/limitador-limitador-67566c68b4-lbws6" Apr 16 16:55:13.191593 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:13.191523 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctpfg\" (UniqueName: \"kubernetes.io/projected/fab44b43-0ba8-4c44-8c2f-1cfc3e92a166-kube-api-access-ctpfg\") pod \"limitador-limitador-67566c68b4-lbws6\" (UID: \"fab44b43-0ba8-4c44-8c2f-1cfc3e92a166\") " pod="kuadrant-system/limitador-limitador-67566c68b4-lbws6" Apr 16 16:55:13.292874 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:13.292833 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" 
(UniqueName: \"kubernetes.io/configmap/fab44b43-0ba8-4c44-8c2f-1cfc3e92a166-config-file\") pod \"limitador-limitador-67566c68b4-lbws6\" (UID: \"fab44b43-0ba8-4c44-8c2f-1cfc3e92a166\") " pod="kuadrant-system/limitador-limitador-67566c68b4-lbws6" Apr 16 16:55:13.292874 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:13.292878 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctpfg\" (UniqueName: \"kubernetes.io/projected/fab44b43-0ba8-4c44-8c2f-1cfc3e92a166-kube-api-access-ctpfg\") pod \"limitador-limitador-67566c68b4-lbws6\" (UID: \"fab44b43-0ba8-4c44-8c2f-1cfc3e92a166\") " pod="kuadrant-system/limitador-limitador-67566c68b4-lbws6" Apr 16 16:55:13.293470 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:13.293451 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/fab44b43-0ba8-4c44-8c2f-1cfc3e92a166-config-file\") pod \"limitador-limitador-67566c68b4-lbws6\" (UID: \"fab44b43-0ba8-4c44-8c2f-1cfc3e92a166\") " pod="kuadrant-system/limitador-limitador-67566c68b4-lbws6" Apr 16 16:55:13.301621 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:13.301600 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctpfg\" (UniqueName: \"kubernetes.io/projected/fab44b43-0ba8-4c44-8c2f-1cfc3e92a166-kube-api-access-ctpfg\") pod \"limitador-limitador-67566c68b4-lbws6\" (UID: \"fab44b43-0ba8-4c44-8c2f-1cfc3e92a166\") " pod="kuadrant-system/limitador-limitador-67566c68b4-lbws6" Apr 16 16:55:13.482490 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:13.482421 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-lbws6" Apr 16 16:55:13.610939 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:13.610912 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-lbws6"] Apr 16 16:55:13.613152 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:55:13.613121 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfab44b43_0ba8_4c44_8c2f_1cfc3e92a166.slice/crio-76dd05129545775fa72e33adef18219a67a29a582e1632650cdeff16a606e9c7 WatchSource:0}: Error finding container 76dd05129545775fa72e33adef18219a67a29a582e1632650cdeff16a606e9c7: Status 404 returned error can't find the container with id 76dd05129545775fa72e33adef18219a67a29a582e1632650cdeff16a606e9c7 Apr 16 16:55:14.182094 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:14.182040 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-lbws6" event={"ID":"fab44b43-0ba8-4c44-8c2f-1cfc3e92a166","Type":"ContainerStarted","Data":"76dd05129545775fa72e33adef18219a67a29a582e1632650cdeff16a606e9c7"} Apr 16 16:55:26.234032 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:26.233995 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-lbws6" event={"ID":"fab44b43-0ba8-4c44-8c2f-1cfc3e92a166","Type":"ContainerStarted","Data":"8b3bbcd419debb1ec134128808dc9f41672be39fdc7a2c90435fa0dfe20d097d"} Apr 16 16:55:26.234408 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:26.234119 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-67566c68b4-lbws6" Apr 16 16:55:26.252485 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:26.252433 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-67566c68b4-lbws6" podStartSLOduration=1.004838198 
podStartE2EDuration="13.252420408s" podCreationTimestamp="2026-04-16 16:55:13 +0000 UTC" firstStartedPulling="2026-04-16 16:55:13.614975077 +0000 UTC m=+430.783686434" lastFinishedPulling="2026-04-16 16:55:25.86255729 +0000 UTC m=+443.031268644" observedRunningTime="2026-04-16 16:55:26.249974044 +0000 UTC m=+443.418685406" watchObservedRunningTime="2026-04-16 16:55:26.252420408 +0000 UTC m=+443.421131815" Apr 16 16:55:37.239595 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:55:37.239562 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-67566c68b4-lbws6" Apr 16 16:56:00.910740 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:00.910705 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9"] Apr 16 16:56:00.911228 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:00.910959 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9" podUID="18cecd37-afce-493b-8a69-3b54afe5f0fd" containerName="discovery" containerID="cri-o://080ba290e4aaf1dc0deab2af73a13ab517f8fd4788f592779973c39d245f8a2e" gracePeriod=30 Apr 16 16:56:01.156029 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:01.156009 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9" Apr 16 16:56:01.251724 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:01.251634 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/18cecd37-afce-493b-8a69-3b54afe5f0fd-istio-token\") pod \"18cecd37-afce-493b-8a69-3b54afe5f0fd\" (UID: \"18cecd37-afce-493b-8a69-3b54afe5f0fd\") " Apr 16 16:56:01.252444 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:01.252421 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/18cecd37-afce-493b-8a69-3b54afe5f0fd-local-certs\") pod \"18cecd37-afce-493b-8a69-3b54afe5f0fd\" (UID: \"18cecd37-afce-493b-8a69-3b54afe5f0fd\") " Apr 16 16:56:01.252573 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:01.252451 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/18cecd37-afce-493b-8a69-3b54afe5f0fd-istio-csr-dns-cert\") pod \"18cecd37-afce-493b-8a69-3b54afe5f0fd\" (UID: \"18cecd37-afce-493b-8a69-3b54afe5f0fd\") " Apr 16 16:56:01.252573 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:01.252476 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgrkl\" (UniqueName: \"kubernetes.io/projected/18cecd37-afce-493b-8a69-3b54afe5f0fd-kube-api-access-bgrkl\") pod \"18cecd37-afce-493b-8a69-3b54afe5f0fd\" (UID: \"18cecd37-afce-493b-8a69-3b54afe5f0fd\") " Apr 16 16:56:01.252573 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:01.252509 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/18cecd37-afce-493b-8a69-3b54afe5f0fd-istio-kubeconfig\") pod \"18cecd37-afce-493b-8a69-3b54afe5f0fd\" (UID: \"18cecd37-afce-493b-8a69-3b54afe5f0fd\") " Apr 16 16:56:01.252573 
ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:01.252536 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/18cecd37-afce-493b-8a69-3b54afe5f0fd-istio-csr-ca-configmap\") pod \"18cecd37-afce-493b-8a69-3b54afe5f0fd\" (UID: \"18cecd37-afce-493b-8a69-3b54afe5f0fd\") " Apr 16 16:56:01.252573 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:01.252552 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/18cecd37-afce-493b-8a69-3b54afe5f0fd-cacerts\") pod \"18cecd37-afce-493b-8a69-3b54afe5f0fd\" (UID: \"18cecd37-afce-493b-8a69-3b54afe5f0fd\") " Apr 16 16:56:01.253006 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:01.252971 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18cecd37-afce-493b-8a69-3b54afe5f0fd-istio-csr-ca-configmap" (OuterVolumeSpecName: "istio-csr-ca-configmap") pod "18cecd37-afce-493b-8a69-3b54afe5f0fd" (UID: "18cecd37-afce-493b-8a69-3b54afe5f0fd"). InnerVolumeSpecName "istio-csr-ca-configmap". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:56:01.254624 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:01.254582 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18cecd37-afce-493b-8a69-3b54afe5f0fd-istio-token" (OuterVolumeSpecName: "istio-token") pod "18cecd37-afce-493b-8a69-3b54afe5f0fd" (UID: "18cecd37-afce-493b-8a69-3b54afe5f0fd"). InnerVolumeSpecName "istio-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:56:01.255251 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:01.255216 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18cecd37-afce-493b-8a69-3b54afe5f0fd-cacerts" (OuterVolumeSpecName: "cacerts") pod "18cecd37-afce-493b-8a69-3b54afe5f0fd" (UID: "18cecd37-afce-493b-8a69-3b54afe5f0fd"). InnerVolumeSpecName "cacerts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:56:01.255378 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:01.255353 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18cecd37-afce-493b-8a69-3b54afe5f0fd-local-certs" (OuterVolumeSpecName: "local-certs") pod "18cecd37-afce-493b-8a69-3b54afe5f0fd" (UID: "18cecd37-afce-493b-8a69-3b54afe5f0fd"). InnerVolumeSpecName "local-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:56:01.255590 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:01.255556 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18cecd37-afce-493b-8a69-3b54afe5f0fd-istio-kubeconfig" (OuterVolumeSpecName: "istio-kubeconfig") pod "18cecd37-afce-493b-8a69-3b54afe5f0fd" (UID: "18cecd37-afce-493b-8a69-3b54afe5f0fd"). InnerVolumeSpecName "istio-kubeconfig". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:56:01.255590 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:01.255567 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18cecd37-afce-493b-8a69-3b54afe5f0fd-kube-api-access-bgrkl" (OuterVolumeSpecName: "kube-api-access-bgrkl") pod "18cecd37-afce-493b-8a69-3b54afe5f0fd" (UID: "18cecd37-afce-493b-8a69-3b54afe5f0fd"). InnerVolumeSpecName "kube-api-access-bgrkl". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:56:01.255590 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:01.255580 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18cecd37-afce-493b-8a69-3b54afe5f0fd-istio-csr-dns-cert" (OuterVolumeSpecName: "istio-csr-dns-cert") pod "18cecd37-afce-493b-8a69-3b54afe5f0fd" (UID: "18cecd37-afce-493b-8a69-3b54afe5f0fd"). InnerVolumeSpecName "istio-csr-dns-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:56:01.353648 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:01.353619 2572 reconciler_common.go:299] "Volume detached for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/18cecd37-afce-493b-8a69-3b54afe5f0fd-local-certs\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:56:01.353648 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:01.353649 2572 reconciler_common.go:299] "Volume detached for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/18cecd37-afce-493b-8a69-3b54afe5f0fd-istio-csr-dns-cert\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:56:01.353850 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:01.353663 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bgrkl\" (UniqueName: \"kubernetes.io/projected/18cecd37-afce-493b-8a69-3b54afe5f0fd-kube-api-access-bgrkl\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:56:01.353850 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:01.353675 2572 reconciler_common.go:299] "Volume detached for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/18cecd37-afce-493b-8a69-3b54afe5f0fd-istio-kubeconfig\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:56:01.353850 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:01.353687 2572 reconciler_common.go:299] "Volume detached for volume \"istio-csr-ca-configmap\" (UniqueName: 
\"kubernetes.io/configmap/18cecd37-afce-493b-8a69-3b54afe5f0fd-istio-csr-ca-configmap\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:56:01.353850 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:01.353699 2572 reconciler_common.go:299] "Volume detached for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/18cecd37-afce-493b-8a69-3b54afe5f0fd-cacerts\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:56:01.353850 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:01.353710 2572 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/18cecd37-afce-493b-8a69-3b54afe5f0fd-istio-token\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:56:01.359665 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:01.359638 2572 generic.go:358] "Generic (PLEG): container finished" podID="18cecd37-afce-493b-8a69-3b54afe5f0fd" containerID="080ba290e4aaf1dc0deab2af73a13ab517f8fd4788f592779973c39d245f8a2e" exitCode=0 Apr 16 16:56:01.359791 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:01.359694 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9" Apr 16 16:56:01.359791 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:01.359737 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9" event={"ID":"18cecd37-afce-493b-8a69-3b54afe5f0fd","Type":"ContainerDied","Data":"080ba290e4aaf1dc0deab2af73a13ab517f8fd4788f592779973c39d245f8a2e"} Apr 16 16:56:01.359791 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:01.359775 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9" event={"ID":"18cecd37-afce-493b-8a69-3b54afe5f0fd","Type":"ContainerDied","Data":"a3b5049dec6096ace8b83a177558b2f46ac034465875de20266906a47c84aff6"} Apr 16 16:56:01.359791 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:01.359791 2572 scope.go:117] "RemoveContainer" containerID="080ba290e4aaf1dc0deab2af73a13ab517f8fd4788f592779973c39d245f8a2e" Apr 16 16:56:01.369305 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:01.369290 2572 scope.go:117] "RemoveContainer" containerID="080ba290e4aaf1dc0deab2af73a13ab517f8fd4788f592779973c39d245f8a2e" Apr 16 16:56:01.369558 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:56:01.369539 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"080ba290e4aaf1dc0deab2af73a13ab517f8fd4788f592779973c39d245f8a2e\": container with ID starting with 080ba290e4aaf1dc0deab2af73a13ab517f8fd4788f592779973c39d245f8a2e not found: ID does not exist" containerID="080ba290e4aaf1dc0deab2af73a13ab517f8fd4788f592779973c39d245f8a2e" Apr 16 16:56:01.369618 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:01.369569 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"080ba290e4aaf1dc0deab2af73a13ab517f8fd4788f592779973c39d245f8a2e"} err="failed to get container status 
\"080ba290e4aaf1dc0deab2af73a13ab517f8fd4788f592779973c39d245f8a2e\": rpc error: code = NotFound desc = could not find container \"080ba290e4aaf1dc0deab2af73a13ab517f8fd4788f592779973c39d245f8a2e\": container with ID starting with 080ba290e4aaf1dc0deab2af73a13ab517f8fd4788f592779973c39d245f8a2e not found: ID does not exist" Apr 16 16:56:01.385798 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:01.385770 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9"] Apr 16 16:56:01.389465 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:01.389443 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wcfq9"] Apr 16 16:56:01.590633 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:01.590550 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18cecd37-afce-493b-8a69-3b54afe5f0fd" path="/var/lib/kubelet/pods/18cecd37-afce-493b-8a69-3b54afe5f0fd/volumes" Apr 16 16:56:05.994038 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:05.994006 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-6986579898-c249f"] Apr 16 16:56:05.994496 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:05.994481 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="18cecd37-afce-493b-8a69-3b54afe5f0fd" containerName="discovery" Apr 16 16:56:05.994539 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:05.994499 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="18cecd37-afce-493b-8a69-3b54afe5f0fd" containerName="discovery" Apr 16 16:56:05.994591 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:05.994581 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="18cecd37-afce-493b-8a69-3b54afe5f0fd" containerName="discovery" Apr 16 16:56:05.999421 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:05.999399 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6986579898-c249f" Apr 16 16:56:06.002789 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.002766 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 16 16:56:06.002943 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.002795 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 16:56:06.003116 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.002796 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 16:56:06.003198 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.002855 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-xsf24\"" Apr 16 16:56:06.004912 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.004892 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6986579898-c249f"] Apr 16 16:56:06.017695 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.017663 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-7c8b759dfd-qjwzr"] Apr 16 16:56:06.020828 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.020809 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-7c8b759dfd-qjwzr" Apr 16 16:56:06.023395 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.023366 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 16:56:06.023503 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.023482 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-x6k2x\"" Apr 16 16:56:06.031988 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.031966 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-7c8b759dfd-qjwzr"] Apr 16 16:56:06.042240 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.042221 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-d9spr"] Apr 16 16:56:06.045879 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.045852 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-d9spr" Apr 16 16:56:06.048508 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.048488 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 16:56:06.048604 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.048567 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-9tpss\"" Apr 16 16:56:06.055184 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.055143 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-d9spr"] Apr 16 16:56:06.087491 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.087463 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2b4z\" (UniqueName: \"kubernetes.io/projected/e3185357-5f43-4a60-845b-a5173fe12a98-kube-api-access-j2b4z\") pod \"kserve-controller-manager-6986579898-c249f\" (UID: \"e3185357-5f43-4a60-845b-a5173fe12a98\") " pod="kserve/kserve-controller-manager-6986579898-c249f" Apr 16 16:56:06.087669 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.087510 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3185357-5f43-4a60-845b-a5173fe12a98-cert\") pod \"kserve-controller-manager-6986579898-c249f\" (UID: \"e3185357-5f43-4a60-845b-a5173fe12a98\") " pod="kserve/kserve-controller-manager-6986579898-c249f" Apr 16 16:56:06.188398 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.188358 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/56bec0b1-0056-4327-9ab4-ceafc5a4df9f-cert\") pod \"llmisvc-controller-manager-7c8b759dfd-qjwzr\" (UID: \"56bec0b1-0056-4327-9ab4-ceafc5a4df9f\") " pod="kserve/llmisvc-controller-manager-7c8b759dfd-qjwzr" Apr 16 16:56:06.188579 
ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.188445 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qv4z\" (UniqueName: \"kubernetes.io/projected/ecc87e25-944f-4abf-9995-fc8d7e8252ac-kube-api-access-2qv4z\") pod \"seaweedfs-86cc847c5c-d9spr\" (UID: \"ecc87e25-944f-4abf-9995-fc8d7e8252ac\") " pod="kserve/seaweedfs-86cc847c5c-d9spr" Apr 16 16:56:06.188579 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.188502 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf9gz\" (UniqueName: \"kubernetes.io/projected/56bec0b1-0056-4327-9ab4-ceafc5a4df9f-kube-api-access-qf9gz\") pod \"llmisvc-controller-manager-7c8b759dfd-qjwzr\" (UID: \"56bec0b1-0056-4327-9ab4-ceafc5a4df9f\") " pod="kserve/llmisvc-controller-manager-7c8b759dfd-qjwzr" Apr 16 16:56:06.188579 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.188544 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j2b4z\" (UniqueName: \"kubernetes.io/projected/e3185357-5f43-4a60-845b-a5173fe12a98-kube-api-access-j2b4z\") pod \"kserve-controller-manager-6986579898-c249f\" (UID: \"e3185357-5f43-4a60-845b-a5173fe12a98\") " pod="kserve/kserve-controller-manager-6986579898-c249f" Apr 16 16:56:06.188809 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.188586 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3185357-5f43-4a60-845b-a5173fe12a98-cert\") pod \"kserve-controller-manager-6986579898-c249f\" (UID: \"e3185357-5f43-4a60-845b-a5173fe12a98\") " pod="kserve/kserve-controller-manager-6986579898-c249f" Apr 16 16:56:06.188809 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.188619 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: 
\"kubernetes.io/empty-dir/ecc87e25-944f-4abf-9995-fc8d7e8252ac-data\") pod \"seaweedfs-86cc847c5c-d9spr\" (UID: \"ecc87e25-944f-4abf-9995-fc8d7e8252ac\") " pod="kserve/seaweedfs-86cc847c5c-d9spr" Apr 16 16:56:06.191190 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.191165 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3185357-5f43-4a60-845b-a5173fe12a98-cert\") pod \"kserve-controller-manager-6986579898-c249f\" (UID: \"e3185357-5f43-4a60-845b-a5173fe12a98\") " pod="kserve/kserve-controller-manager-6986579898-c249f" Apr 16 16:56:06.197522 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.197497 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2b4z\" (UniqueName: \"kubernetes.io/projected/e3185357-5f43-4a60-845b-a5173fe12a98-kube-api-access-j2b4z\") pod \"kserve-controller-manager-6986579898-c249f\" (UID: \"e3185357-5f43-4a60-845b-a5173fe12a98\") " pod="kserve/kserve-controller-manager-6986579898-c249f" Apr 16 16:56:06.289424 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.289345 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qf9gz\" (UniqueName: \"kubernetes.io/projected/56bec0b1-0056-4327-9ab4-ceafc5a4df9f-kube-api-access-qf9gz\") pod \"llmisvc-controller-manager-7c8b759dfd-qjwzr\" (UID: \"56bec0b1-0056-4327-9ab4-ceafc5a4df9f\") " pod="kserve/llmisvc-controller-manager-7c8b759dfd-qjwzr" Apr 16 16:56:06.289424 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.289403 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ecc87e25-944f-4abf-9995-fc8d7e8252ac-data\") pod \"seaweedfs-86cc847c5c-d9spr\" (UID: \"ecc87e25-944f-4abf-9995-fc8d7e8252ac\") " pod="kserve/seaweedfs-86cc847c5c-d9spr" Apr 16 16:56:06.289424 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.289428 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/56bec0b1-0056-4327-9ab4-ceafc5a4df9f-cert\") pod \"llmisvc-controller-manager-7c8b759dfd-qjwzr\" (UID: \"56bec0b1-0056-4327-9ab4-ceafc5a4df9f\") " pod="kserve/llmisvc-controller-manager-7c8b759dfd-qjwzr" Apr 16 16:56:06.289642 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.289458 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2qv4z\" (UniqueName: \"kubernetes.io/projected/ecc87e25-944f-4abf-9995-fc8d7e8252ac-kube-api-access-2qv4z\") pod \"seaweedfs-86cc847c5c-d9spr\" (UID: \"ecc87e25-944f-4abf-9995-fc8d7e8252ac\") " pod="kserve/seaweedfs-86cc847c5c-d9spr" Apr 16 16:56:06.289642 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:56:06.289576 2572 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 16 16:56:06.289716 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:56:06.289647 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/56bec0b1-0056-4327-9ab4-ceafc5a4df9f-cert podName:56bec0b1-0056-4327-9ab4-ceafc5a4df9f nodeName:}" failed. No retries permitted until 2026-04-16 16:56:06.789625207 +0000 UTC m=+483.958336568 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/56bec0b1-0056-4327-9ab4-ceafc5a4df9f-cert") pod "llmisvc-controller-manager-7c8b759dfd-qjwzr" (UID: "56bec0b1-0056-4327-9ab4-ceafc5a4df9f") : secret "llmisvc-webhook-server-cert" not found Apr 16 16:56:06.289838 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.289819 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ecc87e25-944f-4abf-9995-fc8d7e8252ac-data\") pod \"seaweedfs-86cc847c5c-d9spr\" (UID: \"ecc87e25-944f-4abf-9995-fc8d7e8252ac\") " pod="kserve/seaweedfs-86cc847c5c-d9spr" Apr 16 16:56:06.297970 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.297948 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qv4z\" (UniqueName: \"kubernetes.io/projected/ecc87e25-944f-4abf-9995-fc8d7e8252ac-kube-api-access-2qv4z\") pod \"seaweedfs-86cc847c5c-d9spr\" (UID: \"ecc87e25-944f-4abf-9995-fc8d7e8252ac\") " pod="kserve/seaweedfs-86cc847c5c-d9spr" Apr 16 16:56:06.298443 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.298427 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf9gz\" (UniqueName: \"kubernetes.io/projected/56bec0b1-0056-4327-9ab4-ceafc5a4df9f-kube-api-access-qf9gz\") pod \"llmisvc-controller-manager-7c8b759dfd-qjwzr\" (UID: \"56bec0b1-0056-4327-9ab4-ceafc5a4df9f\") " pod="kserve/llmisvc-controller-manager-7c8b759dfd-qjwzr" Apr 16 16:56:06.312668 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.312648 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6986579898-c249f" Apr 16 16:56:06.360766 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.360517 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-d9spr" Apr 16 16:56:06.463772 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.463245 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6986579898-c249f"] Apr 16 16:56:06.505019 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.504991 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-d9spr"] Apr 16 16:56:06.506973 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:56:06.506941 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecc87e25_944f_4abf_9995_fc8d7e8252ac.slice/crio-85fc3ce698c496efd856098a2f494523d18019261659390ba6dd37d3d35a0858 WatchSource:0}: Error finding container 85fc3ce698c496efd856098a2f494523d18019261659390ba6dd37d3d35a0858: Status 404 returned error can't find the container with id 85fc3ce698c496efd856098a2f494523d18019261659390ba6dd37d3d35a0858 Apr 16 16:56:06.794031 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.793999 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/56bec0b1-0056-4327-9ab4-ceafc5a4df9f-cert\") pod \"llmisvc-controller-manager-7c8b759dfd-qjwzr\" (UID: \"56bec0b1-0056-4327-9ab4-ceafc5a4df9f\") " pod="kserve/llmisvc-controller-manager-7c8b759dfd-qjwzr" Apr 16 16:56:06.796485 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.796463 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/56bec0b1-0056-4327-9ab4-ceafc5a4df9f-cert\") pod \"llmisvc-controller-manager-7c8b759dfd-qjwzr\" (UID: \"56bec0b1-0056-4327-9ab4-ceafc5a4df9f\") " pod="kserve/llmisvc-controller-manager-7c8b759dfd-qjwzr" Apr 16 16:56:06.933287 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:06.933254 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-7c8b759dfd-qjwzr" Apr 16 16:56:07.206747 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:07.206691 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-7c8b759dfd-qjwzr"] Apr 16 16:56:07.213212 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:56:07.213110 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod56bec0b1_0056_4327_9ab4_ceafc5a4df9f.slice/crio-445a98decf9a3e393b2d28501d663daac3f1b2372d403dcbc058d2e6e3c63f27 WatchSource:0}: Error finding container 445a98decf9a3e393b2d28501d663daac3f1b2372d403dcbc058d2e6e3c63f27: Status 404 returned error can't find the container with id 445a98decf9a3e393b2d28501d663daac3f1b2372d403dcbc058d2e6e3c63f27 Apr 16 16:56:07.386938 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:07.386874 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-d9spr" event={"ID":"ecc87e25-944f-4abf-9995-fc8d7e8252ac","Type":"ContainerStarted","Data":"85fc3ce698c496efd856098a2f494523d18019261659390ba6dd37d3d35a0858"} Apr 16 16:56:07.388775 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:07.388732 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6986579898-c249f" event={"ID":"e3185357-5f43-4a60-845b-a5173fe12a98","Type":"ContainerStarted","Data":"9e457b798eb729da1a949be9f97782d6c45515fcd9b06474c3e06e7facb0b396"} Apr 16 16:56:07.391033 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:07.390973 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-7c8b759dfd-qjwzr" event={"ID":"56bec0b1-0056-4327-9ab4-ceafc5a4df9f","Type":"ContainerStarted","Data":"445a98decf9a3e393b2d28501d663daac3f1b2372d403dcbc058d2e6e3c63f27"} Apr 16 16:56:10.405178 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:10.405151 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve/kserve-controller-manager-6986579898-c249f" event={"ID":"e3185357-5f43-4a60-845b-a5173fe12a98","Type":"ContainerStarted","Data":"b7c717c1f43966bd15c48c37031503b9b0ce3627228fc2d9206ea0fd4be53a55"} Apr 16 16:56:10.405489 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:10.405205 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-6986579898-c249f" Apr 16 16:56:10.423945 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:10.423902 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-6986579898-c249f" podStartSLOduration=1.983687734 podStartE2EDuration="5.423887495s" podCreationTimestamp="2026-04-16 16:56:05 +0000 UTC" firstStartedPulling="2026-04-16 16:56:06.466976269 +0000 UTC m=+483.635687623" lastFinishedPulling="2026-04-16 16:56:09.907176043 +0000 UTC m=+487.075887384" observedRunningTime="2026-04-16 16:56:10.421092071 +0000 UTC m=+487.589803432" watchObservedRunningTime="2026-04-16 16:56:10.423887495 +0000 UTC m=+487.592598858" Apr 16 16:56:11.410123 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:11.410052 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-d9spr" event={"ID":"ecc87e25-944f-4abf-9995-fc8d7e8252ac","Type":"ContainerStarted","Data":"89dc26820703a99faf76d4c93a6818f754ba190d950a5226e9d77cfeca79d802"} Apr 16 16:56:11.410650 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:11.410132 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-d9spr" Apr 16 16:56:11.426605 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:11.426564 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-d9spr" podStartSLOduration=1.875617718 podStartE2EDuration="5.426552637s" podCreationTimestamp="2026-04-16 16:56:06 +0000 UTC" firstStartedPulling="2026-04-16 16:56:06.50860632 +0000 UTC m=+483.677317661" 
lastFinishedPulling="2026-04-16 16:56:10.059541236 +0000 UTC m=+487.228252580" observedRunningTime="2026-04-16 16:56:11.424329917 +0000 UTC m=+488.593041278" watchObservedRunningTime="2026-04-16 16:56:11.426552637 +0000 UTC m=+488.595263999" Apr 16 16:56:13.418271 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:13.418234 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-7c8b759dfd-qjwzr" event={"ID":"56bec0b1-0056-4327-9ab4-ceafc5a4df9f","Type":"ContainerStarted","Data":"ad4dc70c7a9d48a93b5b3d0eca06bdb7fc679ce9a4af2ae3701a26d705f0099d"} Apr 16 16:56:13.418708 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:13.418347 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-7c8b759dfd-qjwzr" Apr 16 16:56:13.434879 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:13.434837 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-7c8b759dfd-qjwzr" podStartSLOduration=2.561364786 podStartE2EDuration="8.434824325s" podCreationTimestamp="2026-04-16 16:56:05 +0000 UTC" firstStartedPulling="2026-04-16 16:56:07.217279771 +0000 UTC m=+484.385991115" lastFinishedPulling="2026-04-16 16:56:13.090739312 +0000 UTC m=+490.259450654" observedRunningTime="2026-04-16 16:56:13.433963623 +0000 UTC m=+490.602674987" watchObservedRunningTime="2026-04-16 16:56:13.434824325 +0000 UTC m=+490.603535687" Apr 16 16:56:17.415813 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:17.415786 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-d9spr" Apr 16 16:56:41.414867 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:41.414835 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-6986579898-c249f" Apr 16 16:56:44.424910 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:44.424879 2572 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-7c8b759dfd-qjwzr" Apr 16 16:56:45.617619 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:45.617584 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-6986579898-c249f"] Apr 16 16:56:45.618052 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:45.617797 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-6986579898-c249f" podUID="e3185357-5f43-4a60-845b-a5173fe12a98" containerName="manager" containerID="cri-o://b7c717c1f43966bd15c48c37031503b9b0ce3627228fc2d9206ea0fd4be53a55" gracePeriod=10 Apr 16 16:56:45.641710 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:45.641687 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-6986579898-n5mnl"] Apr 16 16:56:45.644965 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:45.644950 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6986579898-n5mnl" Apr 16 16:56:45.655114 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:45.655083 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6986579898-n5mnl"] Apr 16 16:56:45.692169 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:45.692136 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32c8d391-524f-4885-afb2-f374af1adf47-cert\") pod \"kserve-controller-manager-6986579898-n5mnl\" (UID: \"32c8d391-524f-4885-afb2-f374af1adf47\") " pod="kserve/kserve-controller-manager-6986579898-n5mnl" Apr 16 16:56:45.692294 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:45.692186 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j49bk\" (UniqueName: 
\"kubernetes.io/projected/32c8d391-524f-4885-afb2-f374af1adf47-kube-api-access-j49bk\") pod \"kserve-controller-manager-6986579898-n5mnl\" (UID: \"32c8d391-524f-4885-afb2-f374af1adf47\") " pod="kserve/kserve-controller-manager-6986579898-n5mnl" Apr 16 16:56:45.793671 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:45.793625 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j49bk\" (UniqueName: \"kubernetes.io/projected/32c8d391-524f-4885-afb2-f374af1adf47-kube-api-access-j49bk\") pod \"kserve-controller-manager-6986579898-n5mnl\" (UID: \"32c8d391-524f-4885-afb2-f374af1adf47\") " pod="kserve/kserve-controller-manager-6986579898-n5mnl" Apr 16 16:56:45.793822 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:45.793768 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32c8d391-524f-4885-afb2-f374af1adf47-cert\") pod \"kserve-controller-manager-6986579898-n5mnl\" (UID: \"32c8d391-524f-4885-afb2-f374af1adf47\") " pod="kserve/kserve-controller-manager-6986579898-n5mnl" Apr 16 16:56:45.796333 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:45.796313 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32c8d391-524f-4885-afb2-f374af1adf47-cert\") pod \"kserve-controller-manager-6986579898-n5mnl\" (UID: \"32c8d391-524f-4885-afb2-f374af1adf47\") " pod="kserve/kserve-controller-manager-6986579898-n5mnl" Apr 16 16:56:45.802791 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:45.802766 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j49bk\" (UniqueName: \"kubernetes.io/projected/32c8d391-524f-4885-afb2-f374af1adf47-kube-api-access-j49bk\") pod \"kserve-controller-manager-6986579898-n5mnl\" (UID: \"32c8d391-524f-4885-afb2-f374af1adf47\") " pod="kserve/kserve-controller-manager-6986579898-n5mnl" Apr 16 16:56:45.852778 ip-10-0-137-126 
kubenswrapper[2572]: I0416 16:56:45.852751 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6986579898-c249f" Apr 16 16:56:45.895027 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:45.894949 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3185357-5f43-4a60-845b-a5173fe12a98-cert\") pod \"e3185357-5f43-4a60-845b-a5173fe12a98\" (UID: \"e3185357-5f43-4a60-845b-a5173fe12a98\") " Apr 16 16:56:45.895027 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:45.895004 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2b4z\" (UniqueName: \"kubernetes.io/projected/e3185357-5f43-4a60-845b-a5173fe12a98-kube-api-access-j2b4z\") pod \"e3185357-5f43-4a60-845b-a5173fe12a98\" (UID: \"e3185357-5f43-4a60-845b-a5173fe12a98\") " Apr 16 16:56:45.897046 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:45.897020 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3185357-5f43-4a60-845b-a5173fe12a98-kube-api-access-j2b4z" (OuterVolumeSpecName: "kube-api-access-j2b4z") pod "e3185357-5f43-4a60-845b-a5173fe12a98" (UID: "e3185357-5f43-4a60-845b-a5173fe12a98"). InnerVolumeSpecName "kube-api-access-j2b4z". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:56:45.897170 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:45.897044 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3185357-5f43-4a60-845b-a5173fe12a98-cert" (OuterVolumeSpecName: "cert") pod "e3185357-5f43-4a60-845b-a5173fe12a98" (UID: "e3185357-5f43-4a60-845b-a5173fe12a98"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:56:45.989607 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:45.989569 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6986579898-n5mnl" Apr 16 16:56:45.995681 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:45.995661 2572 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3185357-5f43-4a60-845b-a5173fe12a98-cert\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:56:45.995739 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:45.995684 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j2b4z\" (UniqueName: \"kubernetes.io/projected/e3185357-5f43-4a60-845b-a5173fe12a98-kube-api-access-j2b4z\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:56:46.108572 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:46.108549 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6986579898-n5mnl"] Apr 16 16:56:46.110335 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:56:46.110311 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32c8d391_524f_4885_afb2_f374af1adf47.slice/crio-e66d0864acbae8c558391c685c8e4556ba956f45b36a385a156b0f5c4065d269 WatchSource:0}: Error finding container e66d0864acbae8c558391c685c8e4556ba956f45b36a385a156b0f5c4065d269: Status 404 returned error can't find the container with id e66d0864acbae8c558391c685c8e4556ba956f45b36a385a156b0f5c4065d269 Apr 16 16:56:46.539814 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:46.539783 2572 generic.go:358] "Generic (PLEG): container finished" podID="e3185357-5f43-4a60-845b-a5173fe12a98" containerID="b7c717c1f43966bd15c48c37031503b9b0ce3627228fc2d9206ea0fd4be53a55" exitCode=0 Apr 16 16:56:46.539979 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:46.539848 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6986579898-c249f"
Apr 16 16:56:46.539979 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:46.539864 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6986579898-c249f" event={"ID":"e3185357-5f43-4a60-845b-a5173fe12a98","Type":"ContainerDied","Data":"b7c717c1f43966bd15c48c37031503b9b0ce3627228fc2d9206ea0fd4be53a55"}
Apr 16 16:56:46.539979 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:46.539900 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6986579898-c249f" event={"ID":"e3185357-5f43-4a60-845b-a5173fe12a98","Type":"ContainerDied","Data":"9e457b798eb729da1a949be9f97782d6c45515fcd9b06474c3e06e7facb0b396"}
Apr 16 16:56:46.539979 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:46.539915 2572 scope.go:117] "RemoveContainer" containerID="b7c717c1f43966bd15c48c37031503b9b0ce3627228fc2d9206ea0fd4be53a55"
Apr 16 16:56:46.541372 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:46.541344 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6986579898-n5mnl" event={"ID":"32c8d391-524f-4885-afb2-f374af1adf47","Type":"ContainerStarted","Data":"2756cc4ddbe0e1578fe05187760d666c08534fb1bfad25b91aea1ca6894d0855"}
Apr 16 16:56:46.541372 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:46.541378 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6986579898-n5mnl" event={"ID":"32c8d391-524f-4885-afb2-f374af1adf47","Type":"ContainerStarted","Data":"e66d0864acbae8c558391c685c8e4556ba956f45b36a385a156b0f5c4065d269"}
Apr 16 16:56:46.541551 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:46.541470 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-6986579898-n5mnl"
Apr 16 16:56:46.548200 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:46.548183 2572 scope.go:117] "RemoveContainer" containerID="b7c717c1f43966bd15c48c37031503b9b0ce3627228fc2d9206ea0fd4be53a55"
Apr 16 16:56:46.548450 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:56:46.548428 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7c717c1f43966bd15c48c37031503b9b0ce3627228fc2d9206ea0fd4be53a55\": container with ID starting with b7c717c1f43966bd15c48c37031503b9b0ce3627228fc2d9206ea0fd4be53a55 not found: ID does not exist" containerID="b7c717c1f43966bd15c48c37031503b9b0ce3627228fc2d9206ea0fd4be53a55"
Apr 16 16:56:46.548546 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:46.548462 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7c717c1f43966bd15c48c37031503b9b0ce3627228fc2d9206ea0fd4be53a55"} err="failed to get container status \"b7c717c1f43966bd15c48c37031503b9b0ce3627228fc2d9206ea0fd4be53a55\": rpc error: code = NotFound desc = could not find container \"b7c717c1f43966bd15c48c37031503b9b0ce3627228fc2d9206ea0fd4be53a55\": container with ID starting with b7c717c1f43966bd15c48c37031503b9b0ce3627228fc2d9206ea0fd4be53a55 not found: ID does not exist"
Apr 16 16:56:46.561173 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:46.561131 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-6986579898-n5mnl" podStartSLOduration=1.211332641 podStartE2EDuration="1.561121023s" podCreationTimestamp="2026-04-16 16:56:45 +0000 UTC" firstStartedPulling="2026-04-16 16:56:46.111546043 +0000 UTC m=+523.280257384" lastFinishedPulling="2026-04-16 16:56:46.461334424 +0000 UTC m=+523.630045766" observedRunningTime="2026-04-16 16:56:46.558813319 +0000 UTC m=+523.727524681" watchObservedRunningTime="2026-04-16 16:56:46.561121023 +0000 UTC m=+523.729832402"
Apr 16 16:56:46.573709 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:46.573685 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-6986579898-c249f"]
Apr 16 16:56:46.575794 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:46.575770 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-6986579898-c249f"]
Apr 16 16:56:47.590930 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:56:47.590896 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3185357-5f43-4a60-845b-a5173fe12a98" path="/var/lib/kubelet/pods/e3185357-5f43-4a60-845b-a5173fe12a98/volumes"
Apr 16 16:57:17.551237 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:17.551199 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-6986579898-n5mnl"
Apr 16 16:57:18.449711 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:18.449678 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-5qnd9"]
Apr 16 16:57:18.450017 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:18.450005 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e3185357-5f43-4a60-845b-a5173fe12a98" containerName="manager"
Apr 16 16:57:18.450086 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:18.450018 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3185357-5f43-4a60-845b-a5173fe12a98" containerName="manager"
Apr 16 16:57:18.450169 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:18.450114 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="e3185357-5f43-4a60-845b-a5173fe12a98" containerName="manager"
Apr 16 16:57:18.453253 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:18.453233 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-7cvhc"]
Apr 16 16:57:18.453396 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:18.453377 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-5qnd9"
Apr 16 16:57:18.455919 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:18.455897 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 16 16:57:18.456030 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:18.455990 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-hr7lm\""
Apr 16 16:57:18.456521 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:18.456320 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-7cvhc"
Apr 16 16:57:18.458531 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:18.458513 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 16 16:57:18.458769 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:18.458752 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-g5rnq\""
Apr 16 16:57:18.462944 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:18.462919 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-5qnd9"]
Apr 16 16:57:18.474902 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:18.474883 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-7cvhc"]
Apr 16 16:57:18.558654 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:18.558617 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bd7v\" (UniqueName: \"kubernetes.io/projected/adbd0b4b-af77-4b37-91f2-e1b99377e319-kube-api-access-5bd7v\") pod \"odh-model-controller-696fc77849-7cvhc\" (UID: \"adbd0b4b-af77-4b37-91f2-e1b99377e319\") " pod="kserve/odh-model-controller-696fc77849-7cvhc"
Apr 16 16:57:18.558654 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:18.558665 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/11703ac7-7f53-4289-9ee4-19d11376828a-tls-certs\") pod \"model-serving-api-86f7b4b499-5qnd9\" (UID: \"11703ac7-7f53-4289-9ee4-19d11376828a\") " pod="kserve/model-serving-api-86f7b4b499-5qnd9"
Apr 16 16:57:18.559168 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:18.558712 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkhkv\" (UniqueName: \"kubernetes.io/projected/11703ac7-7f53-4289-9ee4-19d11376828a-kube-api-access-rkhkv\") pod \"model-serving-api-86f7b4b499-5qnd9\" (UID: \"11703ac7-7f53-4289-9ee4-19d11376828a\") " pod="kserve/model-serving-api-86f7b4b499-5qnd9"
Apr 16 16:57:18.559168 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:18.558767 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/adbd0b4b-af77-4b37-91f2-e1b99377e319-cert\") pod \"odh-model-controller-696fc77849-7cvhc\" (UID: \"adbd0b4b-af77-4b37-91f2-e1b99377e319\") " pod="kserve/odh-model-controller-696fc77849-7cvhc"
Apr 16 16:57:18.659705 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:18.659663 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5bd7v\" (UniqueName: \"kubernetes.io/projected/adbd0b4b-af77-4b37-91f2-e1b99377e319-kube-api-access-5bd7v\") pod \"odh-model-controller-696fc77849-7cvhc\" (UID: \"adbd0b4b-af77-4b37-91f2-e1b99377e319\") " pod="kserve/odh-model-controller-696fc77849-7cvhc"
Apr 16 16:57:18.659887 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:18.659711 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/11703ac7-7f53-4289-9ee4-19d11376828a-tls-certs\") pod \"model-serving-api-86f7b4b499-5qnd9\" (UID: \"11703ac7-7f53-4289-9ee4-19d11376828a\") " pod="kserve/model-serving-api-86f7b4b499-5qnd9"
Apr 16 16:57:18.659887 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:18.659736 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rkhkv\" (UniqueName: \"kubernetes.io/projected/11703ac7-7f53-4289-9ee4-19d11376828a-kube-api-access-rkhkv\") pod \"model-serving-api-86f7b4b499-5qnd9\" (UID: \"11703ac7-7f53-4289-9ee4-19d11376828a\") " pod="kserve/model-serving-api-86f7b4b499-5qnd9"
Apr 16 16:57:18.659887 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:18.659778 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/adbd0b4b-af77-4b37-91f2-e1b99377e319-cert\") pod \"odh-model-controller-696fc77849-7cvhc\" (UID: \"adbd0b4b-af77-4b37-91f2-e1b99377e319\") " pod="kserve/odh-model-controller-696fc77849-7cvhc"
Apr 16 16:57:18.662593 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:18.662567 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/adbd0b4b-af77-4b37-91f2-e1b99377e319-cert\") pod \"odh-model-controller-696fc77849-7cvhc\" (UID: \"adbd0b4b-af77-4b37-91f2-e1b99377e319\") " pod="kserve/odh-model-controller-696fc77849-7cvhc"
Apr 16 16:57:18.662690 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:18.662571 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/11703ac7-7f53-4289-9ee4-19d11376828a-tls-certs\") pod \"model-serving-api-86f7b4b499-5qnd9\" (UID: \"11703ac7-7f53-4289-9ee4-19d11376828a\") " pod="kserve/model-serving-api-86f7b4b499-5qnd9"
Apr 16 16:57:18.671318 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:18.668148 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bd7v\" (UniqueName: \"kubernetes.io/projected/adbd0b4b-af77-4b37-91f2-e1b99377e319-kube-api-access-5bd7v\") pod \"odh-model-controller-696fc77849-7cvhc\" (UID: \"adbd0b4b-af77-4b37-91f2-e1b99377e319\") " pod="kserve/odh-model-controller-696fc77849-7cvhc"
Apr 16 16:57:18.671318 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:18.668287 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkhkv\" (UniqueName: \"kubernetes.io/projected/11703ac7-7f53-4289-9ee4-19d11376828a-kube-api-access-rkhkv\") pod \"model-serving-api-86f7b4b499-5qnd9\" (UID: \"11703ac7-7f53-4289-9ee4-19d11376828a\") " pod="kserve/model-serving-api-86f7b4b499-5qnd9"
Apr 16 16:57:18.767139 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:18.767050 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-5qnd9"
Apr 16 16:57:18.774753 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:18.774731 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-7cvhc"
Apr 16 16:57:18.918481 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:18.918452 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-7cvhc"]
Apr 16 16:57:18.921116 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:57:18.921087 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadbd0b4b_af77_4b37_91f2_e1b99377e319.slice/crio-3ce468f4a973e30ce91d581e28fa81758799cf8c8d2831352812c2c461d15cc4 WatchSource:0}: Error finding container 3ce468f4a973e30ce91d581e28fa81758799cf8c8d2831352812c2c461d15cc4: Status 404 returned error can't find the container with id 3ce468f4a973e30ce91d581e28fa81758799cf8c8d2831352812c2c461d15cc4
Apr 16 16:57:19.100214 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:19.100192 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-5qnd9"]
Apr 16 16:57:19.101338 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:57:19.101317 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11703ac7_7f53_4289_9ee4_19d11376828a.slice/crio-54140ddfe59e291556561c8496a697f89db35c3c590eebb0989a42cca9d39e85 WatchSource:0}: Error finding container 54140ddfe59e291556561c8496a697f89db35c3c590eebb0989a42cca9d39e85: Status 404 returned error can't find the container with id 54140ddfe59e291556561c8496a697f89db35c3c590eebb0989a42cca9d39e85
Apr 16 16:57:19.660658 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:19.660533 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-7cvhc" event={"ID":"adbd0b4b-af77-4b37-91f2-e1b99377e319","Type":"ContainerStarted","Data":"3ce468f4a973e30ce91d581e28fa81758799cf8c8d2831352812c2c461d15cc4"}
Apr 16 16:57:19.662101 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:19.662029 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-5qnd9" event={"ID":"11703ac7-7f53-4289-9ee4-19d11376828a","Type":"ContainerStarted","Data":"54140ddfe59e291556561c8496a697f89db35c3c590eebb0989a42cca9d39e85"}
Apr 16 16:57:21.672165 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:21.672128 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-7cvhc" event={"ID":"adbd0b4b-af77-4b37-91f2-e1b99377e319","Type":"ContainerStarted","Data":"957b4fb9e2550249abfa1eed276ea249ba35b8112cb5cf124bd0dac10b6b8032"}
Apr 16 16:57:21.672596 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:21.672187 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-7cvhc"
Apr 16 16:57:21.673484 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:21.673463 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-5qnd9" event={"ID":"11703ac7-7f53-4289-9ee4-19d11376828a","Type":"ContainerStarted","Data":"1a74d22781a8654ecd39bfe77d03b4a134b4fb42785f2d6e1c8c7cd63e8eeb1a"}
Apr 16 16:57:21.673595 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:21.673583 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-5qnd9"
Apr 16 16:57:21.688481 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:21.688442 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-7cvhc" podStartSLOduration=1.15295561 podStartE2EDuration="3.688430108s" podCreationTimestamp="2026-04-16 16:57:18 +0000 UTC" firstStartedPulling="2026-04-16 16:57:18.92227589 +0000 UTC m=+556.090987231" lastFinishedPulling="2026-04-16 16:57:21.457750383 +0000 UTC m=+558.626461729" observedRunningTime="2026-04-16 16:57:21.688372779 +0000 UTC m=+558.857084146" watchObservedRunningTime="2026-04-16 16:57:21.688430108 +0000 UTC m=+558.857141481"
Apr 16 16:57:21.704478 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:21.704388 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-5qnd9" podStartSLOduration=1.353469153 podStartE2EDuration="3.704372739s" podCreationTimestamp="2026-04-16 16:57:18 +0000 UTC" firstStartedPulling="2026-04-16 16:57:19.102923318 +0000 UTC m=+556.271634660" lastFinishedPulling="2026-04-16 16:57:21.453826901 +0000 UTC m=+558.622538246" observedRunningTime="2026-04-16 16:57:21.703684666 +0000 UTC m=+558.872396028" watchObservedRunningTime="2026-04-16 16:57:21.704372739 +0000 UTC m=+558.873084106"
Apr 16 16:57:32.679153 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:32.679123 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-7cvhc"
Apr 16 16:57:32.680831 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:32.680812 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-5qnd9"
Apr 16 16:57:33.523745 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:33.523708 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-9vjtv"]
Apr 16 16:57:33.527210 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:33.527194 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-9vjtv"
Apr 16 16:57:33.533621 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:33.533593 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-9vjtv"]
Apr 16 16:57:33.570356 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:33.570317 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2kgl\" (UniqueName: \"kubernetes.io/projected/7985ba5d-5777-4ab1-9f78-7878d9cbf8f2-kube-api-access-m2kgl\") pod \"s3-init-9vjtv\" (UID: \"7985ba5d-5777-4ab1-9f78-7878d9cbf8f2\") " pod="kserve/s3-init-9vjtv"
Apr 16 16:57:33.671477 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:33.671434 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2kgl\" (UniqueName: \"kubernetes.io/projected/7985ba5d-5777-4ab1-9f78-7878d9cbf8f2-kube-api-access-m2kgl\") pod \"s3-init-9vjtv\" (UID: \"7985ba5d-5777-4ab1-9f78-7878d9cbf8f2\") " pod="kserve/s3-init-9vjtv"
Apr 16 16:57:33.680236 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:33.680211 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2kgl\" (UniqueName: \"kubernetes.io/projected/7985ba5d-5777-4ab1-9f78-7878d9cbf8f2-kube-api-access-m2kgl\") pod \"s3-init-9vjtv\" (UID: \"7985ba5d-5777-4ab1-9f78-7878d9cbf8f2\") " pod="kserve/s3-init-9vjtv"
Apr 16 16:57:33.837040 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:33.836953 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-9vjtv"
Apr 16 16:57:33.959618 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:33.959590 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-9vjtv"]
Apr 16 16:57:33.960609 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:57:33.960586 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7985ba5d_5777_4ab1_9f78_7878d9cbf8f2.slice/crio-d6daf9660717029d663fb9b36c73f6deca55abd399f463834dab90db00bd0f98 WatchSource:0}: Error finding container d6daf9660717029d663fb9b36c73f6deca55abd399f463834dab90db00bd0f98: Status 404 returned error can't find the container with id d6daf9660717029d663fb9b36c73f6deca55abd399f463834dab90db00bd0f98
Apr 16 16:57:34.723884 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:34.723829 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-9vjtv" event={"ID":"7985ba5d-5777-4ab1-9f78-7878d9cbf8f2","Type":"ContainerStarted","Data":"d6daf9660717029d663fb9b36c73f6deca55abd399f463834dab90db00bd0f98"}
Apr 16 16:57:38.741876 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:38.741843 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-9vjtv" event={"ID":"7985ba5d-5777-4ab1-9f78-7878d9cbf8f2","Type":"ContainerStarted","Data":"21f884b479e2545f86d5179a0ecaeb0eaff8484a3247395ccf7a4b740d6009f7"}
Apr 16 16:57:38.787331 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:38.787269 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-9vjtv" podStartSLOduration=1.489741538 podStartE2EDuration="5.787251175s" podCreationTimestamp="2026-04-16 16:57:33 +0000 UTC" firstStartedPulling="2026-04-16 16:57:33.962442869 +0000 UTC m=+571.131154214" lastFinishedPulling="2026-04-16 16:57:38.25995251 +0000 UTC m=+575.428663851" observedRunningTime="2026-04-16 16:57:38.786100798 +0000 UTC m=+575.954812157" watchObservedRunningTime="2026-04-16 16:57:38.787251175 +0000 UTC m=+575.955962538"
Apr 16 16:57:41.754865 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:41.754833 2572 generic.go:358] "Generic (PLEG): container finished" podID="7985ba5d-5777-4ab1-9f78-7878d9cbf8f2" containerID="21f884b479e2545f86d5179a0ecaeb0eaff8484a3247395ccf7a4b740d6009f7" exitCode=0
Apr 16 16:57:41.755246 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:41.754905 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-9vjtv" event={"ID":"7985ba5d-5777-4ab1-9f78-7878d9cbf8f2","Type":"ContainerDied","Data":"21f884b479e2545f86d5179a0ecaeb0eaff8484a3247395ccf7a4b740d6009f7"}
Apr 16 16:57:42.888227 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:42.888202 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-9vjtv"
Apr 16 16:57:42.936230 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:42.936203 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2kgl\" (UniqueName: \"kubernetes.io/projected/7985ba5d-5777-4ab1-9f78-7878d9cbf8f2-kube-api-access-m2kgl\") pod \"7985ba5d-5777-4ab1-9f78-7878d9cbf8f2\" (UID: \"7985ba5d-5777-4ab1-9f78-7878d9cbf8f2\") "
Apr 16 16:57:42.938178 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:42.938153 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7985ba5d-5777-4ab1-9f78-7878d9cbf8f2-kube-api-access-m2kgl" (OuterVolumeSpecName: "kube-api-access-m2kgl") pod "7985ba5d-5777-4ab1-9f78-7878d9cbf8f2" (UID: "7985ba5d-5777-4ab1-9f78-7878d9cbf8f2"). InnerVolumeSpecName "kube-api-access-m2kgl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:57:43.036794 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:43.036720 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m2kgl\" (UniqueName: \"kubernetes.io/projected/7985ba5d-5777-4ab1-9f78-7878d9cbf8f2-kube-api-access-m2kgl\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 16:57:43.762315 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:43.762283 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-9vjtv"
Apr 16 16:57:43.762315 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:43.762296 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-9vjtv" event={"ID":"7985ba5d-5777-4ab1-9f78-7878d9cbf8f2","Type":"ContainerDied","Data":"d6daf9660717029d663fb9b36c73f6deca55abd399f463834dab90db00bd0f98"}
Apr 16 16:57:43.762519 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:57:43.762329 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6daf9660717029d663fb9b36c73f6deca55abd399f463834dab90db00bd0f98"
Apr 16 16:58:03.496921 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:03.496897 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brhp4_c0c5c0a0-29b2-4743-af7a-0c1150829a60/ovn-acl-logging/0.log"
Apr 16 16:58:03.497407 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:03.497075 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brhp4_c0c5c0a0-29b2-4743-af7a-0c1150829a60/ovn-acl-logging/0.log"
Apr 16 16:58:19.952511 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:19.952433 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj"]
Apr 16 16:58:19.954871 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:19.953012 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7985ba5d-5777-4ab1-9f78-7878d9cbf8f2" containerName="s3-init"
Apr 16 16:58:19.954871 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:19.953075 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7985ba5d-5777-4ab1-9f78-7878d9cbf8f2" containerName="s3-init"
Apr 16 16:58:19.954871 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:19.953164 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="7985ba5d-5777-4ab1-9f78-7878d9cbf8f2" containerName="s3-init"
Apr 16 16:58:19.956261 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:19.956241 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj"
Apr 16 16:58:19.960221 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:19.960191 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 16 16:58:19.960344 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:19.960220 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-g4psd\""
Apr 16 16:58:19.960344 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:19.960259 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 16:58:19.960344 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:19.960222 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\""
Apr 16 16:58:19.964799 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:19.964759 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj"]
Apr 16 16:58:20.017204 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:20.017176 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kn4c\" (UniqueName: \"kubernetes.io/projected/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-kube-api-access-8kn4c\") pod \"scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj\" (UID: \"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj"
Apr 16 16:58:20.017204 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:20.017208 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-home\") pod \"scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj\" (UID: \"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj"
Apr 16 16:58:20.017461 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:20.017228 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-dshm\") pod \"scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj\" (UID: \"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj"
Apr 16 16:58:20.017461 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:20.017250 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj\" (UID: \"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj"
Apr 16 16:58:20.017461 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:20.017301 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj\" (UID: \"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj"
Apr 16 16:58:20.017461 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:20.017356 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-model-cache\") pod \"scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj\" (UID: \"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj"
Apr 16 16:58:20.118703 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:20.118671 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-model-cache\") pod \"scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj\" (UID: \"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj"
Apr 16 16:58:20.118865 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:20.118739 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8kn4c\" (UniqueName: \"kubernetes.io/projected/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-kube-api-access-8kn4c\") pod \"scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj\" (UID: \"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj"
Apr 16 16:58:20.118865 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:20.118766 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-home\") pod \"scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj\" (UID: \"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj"
Apr 16 16:58:20.118865 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:20.118795 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-dshm\") pod \"scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj\" (UID: \"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj"
Apr 16 16:58:20.118865 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:20.118820 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj\" (UID: \"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj"
Apr 16 16:58:20.118865 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:20.118844 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj\" (UID: \"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj"
Apr 16 16:58:20.119155 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:20.119129 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-model-cache\") pod \"scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj\" (UID: \"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj"
Apr 16 16:58:20.119250 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:20.119199 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-home\") pod \"scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj\" (UID: \"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj"
Apr 16 16:58:20.119362 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:20.119341 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj\" (UID: \"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj"
Apr 16 16:58:20.121133 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:20.121109 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-dshm\") pod \"scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj\" (UID: \"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj"
Apr 16 16:58:20.121476 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:20.121459 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj\" (UID: \"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj"
Apr 16 16:58:20.126977 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:20.126958 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kn4c\" (UniqueName: \"kubernetes.io/projected/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-kube-api-access-8kn4c\") pod \"scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj\" (UID: \"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj"
Apr 16 16:58:20.269972 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:20.269894 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj"
Apr 16 16:58:20.402052 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:20.402013 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj"]
Apr 16 16:58:20.403449 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:58:20.403408 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd451ecfc_fb0f_4ef5_bcf4_c4d85bb7a62c.slice/crio-d76f61e4630f801fdb4f82d3c6ed19296f2a1a48b788ed0f84d92c8aee53f044 WatchSource:0}: Error finding container d76f61e4630f801fdb4f82d3c6ed19296f2a1a48b788ed0f84d92c8aee53f044: Status 404 returned error can't find the container with id d76f61e4630f801fdb4f82d3c6ed19296f2a1a48b788ed0f84d92c8aee53f044
Apr 16 16:58:20.405303 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:20.405288 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 16:58:20.906862 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:20.906819 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj" event={"ID":"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c","Type":"ContainerStarted","Data":"d76f61e4630f801fdb4f82d3c6ed19296f2a1a48b788ed0f84d92c8aee53f044"}
Apr 16 16:58:23.924021 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:23.923996 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj" event={"ID":"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c","Type":"ContainerStarted","Data":"94eadd4ecdf5b6ceb4966dee3f8399b67c9df93aad4e5b7eed4084cb61d53437"}
Apr 16 16:58:28.945178 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:28.945147 2572 generic.go:358] "Generic (PLEG): container finished" podID="d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c" containerID="94eadd4ecdf5b6ceb4966dee3f8399b67c9df93aad4e5b7eed4084cb61d53437" exitCode=0
Apr 16 16:58:28.945554 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:28.945223 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj" event={"ID":"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c","Type":"ContainerDied","Data":"94eadd4ecdf5b6ceb4966dee3f8399b67c9df93aad4e5b7eed4084cb61d53437"}
Apr 16 16:58:29.532525 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:29.532490 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz"]
Apr 16 16:58:29.539861 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:29.539708 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz"
Apr 16 16:58:29.544140 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:29.543889 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs\""
Apr 16 16:58:29.544786 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:29.544765 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz"]
Apr 16 16:58:29.599232 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:29.599192 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8db79fde-00cf-42ea-86fe-9094aca731f5-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz\" (UID: \"8db79fde-00cf-42ea-86fe-9094aca731f5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz"
Apr 16 16:58:29.599394 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:29.599250 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8db79fde-00cf-42ea-86fe-9094aca731f5-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz\" (UID: \"8db79fde-00cf-42ea-86fe-9094aca731f5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz"
Apr 16 16:58:29.599394 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:29.599321 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8db79fde-00cf-42ea-86fe-9094aca731f5-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz\" (UID: \"8db79fde-00cf-42ea-86fe-9094aca731f5\") "
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz" Apr 16 16:58:29.599394 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:29.599362 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8db79fde-00cf-42ea-86fe-9094aca731f5-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz\" (UID: \"8db79fde-00cf-42ea-86fe-9094aca731f5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz" Apr 16 16:58:29.599559 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:29.599390 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8db79fde-00cf-42ea-86fe-9094aca731f5-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz\" (UID: \"8db79fde-00cf-42ea-86fe-9094aca731f5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz" Apr 16 16:58:29.599559 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:29.599449 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpbbg\" (UniqueName: \"kubernetes.io/projected/8db79fde-00cf-42ea-86fe-9094aca731f5-kube-api-access-wpbbg\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz\" (UID: \"8db79fde-00cf-42ea-86fe-9094aca731f5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz" Apr 16 16:58:29.701465 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:29.701426 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8db79fde-00cf-42ea-86fe-9094aca731f5-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz\" (UID: 
\"8db79fde-00cf-42ea-86fe-9094aca731f5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz" Apr 16 16:58:29.701661 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:29.701499 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8db79fde-00cf-42ea-86fe-9094aca731f5-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz\" (UID: \"8db79fde-00cf-42ea-86fe-9094aca731f5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz" Apr 16 16:58:29.701661 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:29.701553 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8db79fde-00cf-42ea-86fe-9094aca731f5-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz\" (UID: \"8db79fde-00cf-42ea-86fe-9094aca731f5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz" Apr 16 16:58:29.701661 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:29.701588 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8db79fde-00cf-42ea-86fe-9094aca731f5-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz\" (UID: \"8db79fde-00cf-42ea-86fe-9094aca731f5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz" Apr 16 16:58:29.701661 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:29.701615 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8db79fde-00cf-42ea-86fe-9094aca731f5-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz\" (UID: \"8db79fde-00cf-42ea-86fe-9094aca731f5\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz" Apr 16 16:58:29.701661 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:29.701633 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wpbbg\" (UniqueName: \"kubernetes.io/projected/8db79fde-00cf-42ea-86fe-9094aca731f5-kube-api-access-wpbbg\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz\" (UID: \"8db79fde-00cf-42ea-86fe-9094aca731f5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz" Apr 16 16:58:29.702488 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:29.702236 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8db79fde-00cf-42ea-86fe-9094aca731f5-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz\" (UID: \"8db79fde-00cf-42ea-86fe-9094aca731f5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz" Apr 16 16:58:29.702488 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:29.702409 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8db79fde-00cf-42ea-86fe-9094aca731f5-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz\" (UID: \"8db79fde-00cf-42ea-86fe-9094aca731f5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz" Apr 16 16:58:29.702488 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:29.702454 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8db79fde-00cf-42ea-86fe-9094aca731f5-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz\" (UID: \"8db79fde-00cf-42ea-86fe-9094aca731f5\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz" Apr 16 16:58:29.705310 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:29.704566 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8db79fde-00cf-42ea-86fe-9094aca731f5-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz\" (UID: \"8db79fde-00cf-42ea-86fe-9094aca731f5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz" Apr 16 16:58:29.705310 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:29.705154 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8db79fde-00cf-42ea-86fe-9094aca731f5-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz\" (UID: \"8db79fde-00cf-42ea-86fe-9094aca731f5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz" Apr 16 16:58:29.709708 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:29.709683 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpbbg\" (UniqueName: \"kubernetes.io/projected/8db79fde-00cf-42ea-86fe-9094aca731f5-kube-api-access-wpbbg\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz\" (UID: \"8db79fde-00cf-42ea-86fe-9094aca731f5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz" Apr 16 16:58:29.856486 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:29.856412 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz" Apr 16 16:58:30.270109 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:30.270083 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz"] Apr 16 16:58:30.271616 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:58:30.271589 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8db79fde_00cf_42ea_86fe_9094aca731f5.slice/crio-d8d653d5b79f072172160c0f30ccaf335f4fdf3510778221a902253a3b9e4163 WatchSource:0}: Error finding container d8d653d5b79f072172160c0f30ccaf335f4fdf3510778221a902253a3b9e4163: Status 404 returned error can't find the container with id d8d653d5b79f072172160c0f30ccaf335f4fdf3510778221a902253a3b9e4163 Apr 16 16:58:30.954233 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:30.954199 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz" event={"ID":"8db79fde-00cf-42ea-86fe-9094aca731f5","Type":"ContainerStarted","Data":"f14574097a5553b5da3bcc980624f256dfaf3a1786c09c3cd51f01aa7088d833"} Apr 16 16:58:30.954415 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:30.954239 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz" event={"ID":"8db79fde-00cf-42ea-86fe-9094aca731f5","Type":"ContainerStarted","Data":"d8d653d5b79f072172160c0f30ccaf335f4fdf3510778221a902253a3b9e4163"} Apr 16 16:58:30.955890 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:30.955866 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj" 
event={"ID":"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c","Type":"ContainerStarted","Data":"ba6fd676ce4e934563bd8f3498683ab3d1d068422bf5939062e0d0e367d6bddc"} Apr 16 16:58:30.989495 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:30.989449 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj" podStartSLOduration=2.183479827 podStartE2EDuration="11.989438153s" podCreationTimestamp="2026-04-16 16:58:19 +0000 UTC" firstStartedPulling="2026-04-16 16:58:20.405418107 +0000 UTC m=+617.574129449" lastFinishedPulling="2026-04-16 16:58:30.211376432 +0000 UTC m=+627.380087775" observedRunningTime="2026-04-16 16:58:30.987542178 +0000 UTC m=+628.156253541" watchObservedRunningTime="2026-04-16 16:58:30.989438153 +0000 UTC m=+628.158149512" Apr 16 16:58:34.971595 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:34.971566 2572 generic.go:358] "Generic (PLEG): container finished" podID="8db79fde-00cf-42ea-86fe-9094aca731f5" containerID="f14574097a5553b5da3bcc980624f256dfaf3a1786c09c3cd51f01aa7088d833" exitCode=0 Apr 16 16:58:34.971931 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:34.971638 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz" event={"ID":"8db79fde-00cf-42ea-86fe-9094aca731f5","Type":"ContainerDied","Data":"f14574097a5553b5da3bcc980624f256dfaf3a1786c09c3cd51f01aa7088d833"} Apr 16 16:58:35.977129 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:35.977091 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz" event={"ID":"8db79fde-00cf-42ea-86fe-9094aca731f5","Type":"ContainerStarted","Data":"da204fb1989245b44be9854cb130c3f87c518f0b89e2e65cea351059293c2e7e"} Apr 16 16:58:35.997391 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:35.997345 2572 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz" podStartSLOduration=6.997333086 podStartE2EDuration="6.997333086s" podCreationTimestamp="2026-04-16 16:58:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:58:35.994403796 +0000 UTC m=+633.163115159" watchObservedRunningTime="2026-04-16 16:58:35.997333086 +0000 UTC m=+633.166044448" Apr 16 16:58:39.856868 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:39.856793 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz" Apr 16 16:58:39.856868 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:39.856831 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz" Apr 16 16:58:39.868950 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:39.868927 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz" Apr 16 16:58:40.004405 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:40.004371 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz" Apr 16 16:58:40.270483 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:40.270451 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj" Apr 16 16:58:40.270675 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:40.270517 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj" Apr 16 16:58:40.282909 ip-10-0-137-126 
kubenswrapper[2572]: I0416 16:58:40.282880 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj" Apr 16 16:58:41.007580 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:41.007551 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj" Apr 16 16:58:53.403295 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:53.403259 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz"] Apr 16 16:58:53.403685 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:53.403514 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz" podUID="8db79fde-00cf-42ea-86fe-9094aca731f5" containerName="main" containerID="cri-o://da204fb1989245b44be9854cb130c3f87c518f0b89e2e65cea351059293c2e7e" gracePeriod=30 Apr 16 16:58:53.676856 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:53.676833 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz" Apr 16 16:58:53.802706 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:53.802673 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8db79fde-00cf-42ea-86fe-9094aca731f5-dshm\") pod \"8db79fde-00cf-42ea-86fe-9094aca731f5\" (UID: \"8db79fde-00cf-42ea-86fe-9094aca731f5\") " Apr 16 16:58:53.802865 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:53.802725 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8db79fde-00cf-42ea-86fe-9094aca731f5-model-cache\") pod \"8db79fde-00cf-42ea-86fe-9094aca731f5\" (UID: \"8db79fde-00cf-42ea-86fe-9094aca731f5\") " Apr 16 16:58:53.802865 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:53.802755 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpbbg\" (UniqueName: \"kubernetes.io/projected/8db79fde-00cf-42ea-86fe-9094aca731f5-kube-api-access-wpbbg\") pod \"8db79fde-00cf-42ea-86fe-9094aca731f5\" (UID: \"8db79fde-00cf-42ea-86fe-9094aca731f5\") " Apr 16 16:58:53.802865 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:53.802839 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8db79fde-00cf-42ea-86fe-9094aca731f5-kserve-provision-location\") pod \"8db79fde-00cf-42ea-86fe-9094aca731f5\" (UID: \"8db79fde-00cf-42ea-86fe-9094aca731f5\") " Apr 16 16:58:53.803018 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:53.802870 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8db79fde-00cf-42ea-86fe-9094aca731f5-home\") pod \"8db79fde-00cf-42ea-86fe-9094aca731f5\" (UID: \"8db79fde-00cf-42ea-86fe-9094aca731f5\") " Apr 16 16:58:53.803018 
ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:53.802906 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8db79fde-00cf-42ea-86fe-9094aca731f5-tls-certs\") pod \"8db79fde-00cf-42ea-86fe-9094aca731f5\" (UID: \"8db79fde-00cf-42ea-86fe-9094aca731f5\") " Apr 16 16:58:53.803018 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:53.802966 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8db79fde-00cf-42ea-86fe-9094aca731f5-model-cache" (OuterVolumeSpecName: "model-cache") pod "8db79fde-00cf-42ea-86fe-9094aca731f5" (UID: "8db79fde-00cf-42ea-86fe-9094aca731f5"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:58:53.803197 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:53.803139 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8db79fde-00cf-42ea-86fe-9094aca731f5-model-cache\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:58:53.803292 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:53.803270 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8db79fde-00cf-42ea-86fe-9094aca731f5-home" (OuterVolumeSpecName: "home") pod "8db79fde-00cf-42ea-86fe-9094aca731f5" (UID: "8db79fde-00cf-42ea-86fe-9094aca731f5"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:58:53.805045 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:53.805021 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8db79fde-00cf-42ea-86fe-9094aca731f5-kube-api-access-wpbbg" (OuterVolumeSpecName: "kube-api-access-wpbbg") pod "8db79fde-00cf-42ea-86fe-9094aca731f5" (UID: "8db79fde-00cf-42ea-86fe-9094aca731f5"). InnerVolumeSpecName "kube-api-access-wpbbg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:58:53.805254 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:53.805235 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8db79fde-00cf-42ea-86fe-9094aca731f5-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8db79fde-00cf-42ea-86fe-9094aca731f5" (UID: "8db79fde-00cf-42ea-86fe-9094aca731f5"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:58:53.805326 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:53.805238 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8db79fde-00cf-42ea-86fe-9094aca731f5-dshm" (OuterVolumeSpecName: "dshm") pod "8db79fde-00cf-42ea-86fe-9094aca731f5" (UID: "8db79fde-00cf-42ea-86fe-9094aca731f5"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:58:53.904362 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:53.904333 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8db79fde-00cf-42ea-86fe-9094aca731f5-home\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:58:53.904362 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:53.904361 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8db79fde-00cf-42ea-86fe-9094aca731f5-tls-certs\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:58:53.904511 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:53.904369 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8db79fde-00cf-42ea-86fe-9094aca731f5-dshm\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:58:53.904511 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:53.904378 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wpbbg\" (UniqueName: 
\"kubernetes.io/projected/8db79fde-00cf-42ea-86fe-9094aca731f5-kube-api-access-wpbbg\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:58:54.046140 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:54.046031 2572 generic.go:358] "Generic (PLEG): container finished" podID="8db79fde-00cf-42ea-86fe-9094aca731f5" containerID="da204fb1989245b44be9854cb130c3f87c518f0b89e2e65cea351059293c2e7e" exitCode=0 Apr 16 16:58:54.046140 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:54.046125 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz" Apr 16 16:58:54.046325 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:54.046121 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz" event={"ID":"8db79fde-00cf-42ea-86fe-9094aca731f5","Type":"ContainerDied","Data":"da204fb1989245b44be9854cb130c3f87c518f0b89e2e65cea351059293c2e7e"} Apr 16 16:58:54.046325 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:54.046240 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz" event={"ID":"8db79fde-00cf-42ea-86fe-9094aca731f5","Type":"ContainerDied","Data":"d8d653d5b79f072172160c0f30ccaf335f4fdf3510778221a902253a3b9e4163"} Apr 16 16:58:54.046325 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:54.046262 2572 scope.go:117] "RemoveContainer" containerID="da204fb1989245b44be9854cb130c3f87c518f0b89e2e65cea351059293c2e7e" Apr 16 16:58:54.054888 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:54.054872 2572 scope.go:117] "RemoveContainer" containerID="f14574097a5553b5da3bcc980624f256dfaf3a1786c09c3cd51f01aa7088d833" Apr 16 16:58:54.122746 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:54.122716 2572 scope.go:117] "RemoveContainer" 
containerID="da204fb1989245b44be9854cb130c3f87c518f0b89e2e65cea351059293c2e7e" Apr 16 16:58:54.123497 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:58:54.123347 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da204fb1989245b44be9854cb130c3f87c518f0b89e2e65cea351059293c2e7e\": container with ID starting with da204fb1989245b44be9854cb130c3f87c518f0b89e2e65cea351059293c2e7e not found: ID does not exist" containerID="da204fb1989245b44be9854cb130c3f87c518f0b89e2e65cea351059293c2e7e" Apr 16 16:58:54.123614 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:54.123509 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da204fb1989245b44be9854cb130c3f87c518f0b89e2e65cea351059293c2e7e"} err="failed to get container status \"da204fb1989245b44be9854cb130c3f87c518f0b89e2e65cea351059293c2e7e\": rpc error: code = NotFound desc = could not find container \"da204fb1989245b44be9854cb130c3f87c518f0b89e2e65cea351059293c2e7e\": container with ID starting with da204fb1989245b44be9854cb130c3f87c518f0b89e2e65cea351059293c2e7e not found: ID does not exist" Apr 16 16:58:54.123614 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:54.123537 2572 scope.go:117] "RemoveContainer" containerID="f14574097a5553b5da3bcc980624f256dfaf3a1786c09c3cd51f01aa7088d833" Apr 16 16:58:54.123850 ip-10-0-137-126 kubenswrapper[2572]: E0416 16:58:54.123824 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f14574097a5553b5da3bcc980624f256dfaf3a1786c09c3cd51f01aa7088d833\": container with ID starting with f14574097a5553b5da3bcc980624f256dfaf3a1786c09c3cd51f01aa7088d833 not found: ID does not exist" containerID="f14574097a5553b5da3bcc980624f256dfaf3a1786c09c3cd51f01aa7088d833" Apr 16 16:58:54.123895 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:54.123856 2572 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"f14574097a5553b5da3bcc980624f256dfaf3a1786c09c3cd51f01aa7088d833"} err="failed to get container status \"f14574097a5553b5da3bcc980624f256dfaf3a1786c09c3cd51f01aa7088d833\": rpc error: code = NotFound desc = could not find container \"f14574097a5553b5da3bcc980624f256dfaf3a1786c09c3cd51f01aa7088d833\": container with ID starting with f14574097a5553b5da3bcc980624f256dfaf3a1786c09c3cd51f01aa7088d833 not found: ID does not exist" Apr 16 16:58:54.664591 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:54.664545 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8db79fde-00cf-42ea-86fe-9094aca731f5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8db79fde-00cf-42ea-86fe-9094aca731f5" (UID: "8db79fde-00cf-42ea-86fe-9094aca731f5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:58:54.709159 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:54.709129 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8db79fde-00cf-42ea-86fe-9094aca731f5-kserve-provision-location\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:58:54.969197 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:54.969170 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz"] Apr 16 16:58:54.972740 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:54.972720 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-85955b4ccbk5lnz"] Apr 16 16:58:55.596390 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:55.596346 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8db79fde-00cf-42ea-86fe-9094aca731f5" 
path="/var/lib/kubelet/pods/8db79fde-00cf-42ea-86fe-9094aca731f5/volumes" Apr 16 16:58:58.342469 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:58.342436 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd"] Apr 16 16:58:58.342817 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:58.342790 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8db79fde-00cf-42ea-86fe-9094aca731f5" containerName="storage-initializer" Apr 16 16:58:58.342817 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:58.342801 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db79fde-00cf-42ea-86fe-9094aca731f5" containerName="storage-initializer" Apr 16 16:58:58.342817 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:58.342812 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8db79fde-00cf-42ea-86fe-9094aca731f5" containerName="main" Apr 16 16:58:58.342817 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:58.342817 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db79fde-00cf-42ea-86fe-9094aca731f5" containerName="main" Apr 16 16:58:58.342944 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:58.342876 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="8db79fde-00cf-42ea-86fe-9094aca731f5" containerName="main" Apr 16 16:58:58.348274 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:58.348253 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" Apr 16 16:58:58.350959 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:58.350938 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 16 16:58:58.355634 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:58.355610 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd"] Apr 16 16:58:58.437201 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:58.437166 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llckq\" (UniqueName: \"kubernetes.io/projected/d4689eed-e55a-4551-b7ee-12e706d076aa-kube-api-access-llckq\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd\" (UID: \"d4689eed-e55a-4551-b7ee-12e706d076aa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" Apr 16 16:58:58.437201 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:58.437201 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d4689eed-e55a-4551-b7ee-12e706d076aa-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd\" (UID: \"d4689eed-e55a-4551-b7ee-12e706d076aa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" Apr 16 16:58:58.437418 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:58.437223 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d4689eed-e55a-4551-b7ee-12e706d076aa-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd\" (UID: \"d4689eed-e55a-4551-b7ee-12e706d076aa\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" Apr 16 16:58:58.437418 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:58.437274 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d4689eed-e55a-4551-b7ee-12e706d076aa-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd\" (UID: \"d4689eed-e55a-4551-b7ee-12e706d076aa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" Apr 16 16:58:58.437418 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:58.437327 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d4689eed-e55a-4551-b7ee-12e706d076aa-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd\" (UID: \"d4689eed-e55a-4551-b7ee-12e706d076aa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" Apr 16 16:58:58.437418 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:58.437353 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d4689eed-e55a-4551-b7ee-12e706d076aa-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd\" (UID: \"d4689eed-e55a-4551-b7ee-12e706d076aa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" Apr 16 16:58:58.538125 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:58.538089 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d4689eed-e55a-4551-b7ee-12e706d076aa-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd\" (UID: \"d4689eed-e55a-4551-b7ee-12e706d076aa\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" Apr 16 16:58:58.538324 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:58.538151 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d4689eed-e55a-4551-b7ee-12e706d076aa-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd\" (UID: \"d4689eed-e55a-4551-b7ee-12e706d076aa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" Apr 16 16:58:58.538324 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:58.538209 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d4689eed-e55a-4551-b7ee-12e706d076aa-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd\" (UID: \"d4689eed-e55a-4551-b7ee-12e706d076aa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" Apr 16 16:58:58.538324 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:58.538251 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d4689eed-e55a-4551-b7ee-12e706d076aa-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd\" (UID: \"d4689eed-e55a-4551-b7ee-12e706d076aa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" Apr 16 16:58:58.538500 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:58.538426 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-llckq\" (UniqueName: \"kubernetes.io/projected/d4689eed-e55a-4551-b7ee-12e706d076aa-kube-api-access-llckq\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd\" (UID: \"d4689eed-e55a-4551-b7ee-12e706d076aa\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" Apr 16 16:58:58.538500 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:58.538477 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d4689eed-e55a-4551-b7ee-12e706d076aa-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd\" (UID: \"d4689eed-e55a-4551-b7ee-12e706d076aa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" Apr 16 16:58:58.538605 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:58.538545 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d4689eed-e55a-4551-b7ee-12e706d076aa-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd\" (UID: \"d4689eed-e55a-4551-b7ee-12e706d076aa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" Apr 16 16:58:58.538673 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:58.538603 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d4689eed-e55a-4551-b7ee-12e706d076aa-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd\" (UID: \"d4689eed-e55a-4551-b7ee-12e706d076aa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" Apr 16 16:58:58.538769 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:58.538731 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d4689eed-e55a-4551-b7ee-12e706d076aa-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd\" (UID: \"d4689eed-e55a-4551-b7ee-12e706d076aa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" Apr 16 
16:58:58.540307 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:58.540288 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d4689eed-e55a-4551-b7ee-12e706d076aa-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd\" (UID: \"d4689eed-e55a-4551-b7ee-12e706d076aa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" Apr 16 16:58:58.540535 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:58.540520 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d4689eed-e55a-4551-b7ee-12e706d076aa-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd\" (UID: \"d4689eed-e55a-4551-b7ee-12e706d076aa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" Apr 16 16:58:58.551615 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:58.551582 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-llckq\" (UniqueName: \"kubernetes.io/projected/d4689eed-e55a-4551-b7ee-12e706d076aa-kube-api-access-llckq\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd\" (UID: \"d4689eed-e55a-4551-b7ee-12e706d076aa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" Apr 16 16:58:58.660335 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:58.660301 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" Apr 16 16:58:58.811222 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:58.811193 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd"] Apr 16 16:58:58.811597 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:58:58.811575 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4689eed_e55a_4551_b7ee_12e706d076aa.slice/crio-4e1368cf281f067a1ff3e2c3213adb3d54813f0461855e9f5b847608bec914b1 WatchSource:0}: Error finding container 4e1368cf281f067a1ff3e2c3213adb3d54813f0461855e9f5b847608bec914b1: Status 404 returned error can't find the container with id 4e1368cf281f067a1ff3e2c3213adb3d54813f0461855e9f5b847608bec914b1 Apr 16 16:58:59.068261 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:59.068185 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" event={"ID":"d4689eed-e55a-4551-b7ee-12e706d076aa","Type":"ContainerStarted","Data":"2f8d02e15a6af09f920bba507fa1d6436ad9b857bc9bdd85e5d464ce786dd447"} Apr 16 16:58:59.068261 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:58:59.068225 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" event={"ID":"d4689eed-e55a-4551-b7ee-12e706d076aa","Type":"ContainerStarted","Data":"4e1368cf281f067a1ff3e2c3213adb3d54813f0461855e9f5b847608bec914b1"} Apr 16 16:59:03.089684 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:03.089649 2572 generic.go:358] "Generic (PLEG): container finished" podID="d4689eed-e55a-4551-b7ee-12e706d076aa" containerID="2f8d02e15a6af09f920bba507fa1d6436ad9b857bc9bdd85e5d464ce786dd447" exitCode=0 Apr 16 16:59:03.090155 ip-10-0-137-126 kubenswrapper[2572]: I0416 
16:59:03.089718 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" event={"ID":"d4689eed-e55a-4551-b7ee-12e706d076aa","Type":"ContainerDied","Data":"2f8d02e15a6af09f920bba507fa1d6436ad9b857bc9bdd85e5d464ce786dd447"} Apr 16 16:59:27.379507 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:27.379470 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj"] Apr 16 16:59:27.380042 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:27.379830 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj" podUID="d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c" containerName="main" containerID="cri-o://ba6fd676ce4e934563bd8f3498683ab3d1d068422bf5939062e0d0e367d6bddc" gracePeriod=30 Apr 16 16:59:28.214482 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:28.214446 2572 generic.go:358] "Generic (PLEG): container finished" podID="d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c" containerID="ba6fd676ce4e934563bd8f3498683ab3d1d068422bf5939062e0d0e367d6bddc" exitCode=0 Apr 16 16:59:28.214669 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:28.214502 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj" event={"ID":"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c","Type":"ContainerDied","Data":"ba6fd676ce4e934563bd8f3498683ab3d1d068422bf5939062e0d0e367d6bddc"} Apr 16 16:59:28.733636 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:28.733611 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj" Apr 16 16:59:28.811946 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:28.811914 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-kserve-provision-location\") pod \"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c\" (UID: \"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c\") " Apr 16 16:59:28.811946 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:28.811950 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-dshm\") pod \"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c\" (UID: \"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c\") " Apr 16 16:59:28.812209 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:28.811978 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kn4c\" (UniqueName: \"kubernetes.io/projected/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-kube-api-access-8kn4c\") pod \"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c\" (UID: \"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c\") " Apr 16 16:59:28.812209 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:28.812050 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-home\") pod \"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c\" (UID: \"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c\") " Apr 16 16:59:28.812209 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:28.812099 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-tls-certs\") pod \"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c\" (UID: \"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c\") " Apr 16 16:59:28.812209 ip-10-0-137-126 
kubenswrapper[2572]: I0416 16:59:28.812144 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-model-cache\") pod \"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c\" (UID: \"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c\") " Apr 16 16:59:28.812415 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:28.812301 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-home" (OuterVolumeSpecName: "home") pod "d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c" (UID: "d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:59:28.812471 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:28.812428 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-home\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:59:28.812540 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:28.812510 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-model-cache" (OuterVolumeSpecName: "model-cache") pod "d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c" (UID: "d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:59:28.814124 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:28.814101 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-kube-api-access-8kn4c" (OuterVolumeSpecName: "kube-api-access-8kn4c") pod "d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c" (UID: "d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c"). InnerVolumeSpecName "kube-api-access-8kn4c". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:59:28.814397 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:28.814373 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-dshm" (OuterVolumeSpecName: "dshm") pod "d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c" (UID: "d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:59:28.814494 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:28.814375 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c" (UID: "d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:59:28.870426 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:28.870390 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c" (UID: "d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:59:28.913226 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:28.913200 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-tls-certs\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:59:28.913226 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:28.913225 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-model-cache\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:59:28.913378 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:28.913235 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-kserve-provision-location\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:59:28.913378 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:28.913245 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-dshm\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:59:28.913378 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:28.913255 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8kn4c\" (UniqueName: \"kubernetes.io/projected/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c-kube-api-access-8kn4c\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 16:59:29.220186 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:29.220159 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj" event={"ID":"d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c","Type":"ContainerDied","Data":"d76f61e4630f801fdb4f82d3c6ed19296f2a1a48b788ed0f84d92c8aee53f044"} Apr 16 16:59:29.220279 
ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:29.220202 2572 scope.go:117] "RemoveContainer" containerID="ba6fd676ce4e934563bd8f3498683ab3d1d068422bf5939062e0d0e367d6bddc" Apr 16 16:59:29.220279 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:29.220225 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj" Apr 16 16:59:29.228699 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:29.228677 2572 scope.go:117] "RemoveContainer" containerID="94eadd4ecdf5b6ceb4966dee3f8399b67c9df93aad4e5b7eed4084cb61d53437" Apr 16 16:59:29.266396 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:29.266370 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj"] Apr 16 16:59:29.271111 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:29.271083 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7f9f4fb44-tnllj"] Apr 16 16:59:29.591617 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:29.591575 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c" path="/var/lib/kubelet/pods/d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c/volumes" Apr 16 16:59:30.226648 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:30.226613 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" event={"ID":"d4689eed-e55a-4551-b7ee-12e706d076aa","Type":"ContainerStarted","Data":"200a8cce261196d9f28b0a0fd051c9bc5e1ec751ba761ee83292bba58d130747"} Apr 16 16:59:30.248266 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:30.248211 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" podStartSLOduration=6.225186147 
podStartE2EDuration="32.248192924s" podCreationTimestamp="2026-04-16 16:58:58 +0000 UTC" firstStartedPulling="2026-04-16 16:59:03.090710629 +0000 UTC m=+660.259421971" lastFinishedPulling="2026-04-16 16:59:29.113717406 +0000 UTC m=+686.282428748" observedRunningTime="2026-04-16 16:59:30.244828695 +0000 UTC m=+687.413540057" watchObservedRunningTime="2026-04-16 16:59:30.248192924 +0000 UTC m=+687.416904286" Apr 16 16:59:37.531844 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.531805 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t"] Apr 16 16:59:37.532771 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.532355 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c" containerName="storage-initializer" Apr 16 16:59:37.532771 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.532375 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c" containerName="storage-initializer" Apr 16 16:59:37.532771 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.532408 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c" containerName="main" Apr 16 16:59:37.532771 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.532415 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c" containerName="main" Apr 16 16:59:37.532771 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.532494 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="d451ecfc-fb0f-4ef5-bcf4-c4d85bb7a62c" containerName="main" Apr 16 16:59:37.565532 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.565495 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t"] Apr 16 16:59:37.565686 ip-10-0-137-126 kubenswrapper[2572]: I0416 
16:59:37.565618 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" Apr 16 16:59:37.568672 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.568649 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 16 16:59:37.691107 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.691047 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/784ba4f2-722d-4fc0-8ffe-c19a488607b2-dshm\") pod \"scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t\" (UID: \"784ba4f2-722d-4fc0-8ffe-c19a488607b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" Apr 16 16:59:37.691304 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.691114 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/784ba4f2-722d-4fc0-8ffe-c19a488607b2-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t\" (UID: \"784ba4f2-722d-4fc0-8ffe-c19a488607b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" Apr 16 16:59:37.691304 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.691194 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/784ba4f2-722d-4fc0-8ffe-c19a488607b2-home\") pod \"scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t\" (UID: \"784ba4f2-722d-4fc0-8ffe-c19a488607b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" Apr 16 16:59:37.691304 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.691221 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-lfzzp\" (UniqueName: \"kubernetes.io/projected/784ba4f2-722d-4fc0-8ffe-c19a488607b2-kube-api-access-lfzzp\") pod \"scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t\" (UID: \"784ba4f2-722d-4fc0-8ffe-c19a488607b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" Apr 16 16:59:37.691476 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.691326 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/784ba4f2-722d-4fc0-8ffe-c19a488607b2-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t\" (UID: \"784ba4f2-722d-4fc0-8ffe-c19a488607b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" Apr 16 16:59:37.691476 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.691358 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/784ba4f2-722d-4fc0-8ffe-c19a488607b2-model-cache\") pod \"scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t\" (UID: \"784ba4f2-722d-4fc0-8ffe-c19a488607b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" Apr 16 16:59:37.768260 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.768228 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7"] Apr 16 16:59:37.772735 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.772714 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" Apr 16 16:59:37.775734 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.775714 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-tkh64\"" Apr 16 16:59:37.783821 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.783764 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7"] Apr 16 16:59:37.791904 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.791878 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/784ba4f2-722d-4fc0-8ffe-c19a488607b2-dshm\") pod \"scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t\" (UID: \"784ba4f2-722d-4fc0-8ffe-c19a488607b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" Apr 16 16:59:37.791996 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.791918 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/784ba4f2-722d-4fc0-8ffe-c19a488607b2-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t\" (UID: \"784ba4f2-722d-4fc0-8ffe-c19a488607b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" Apr 16 16:59:37.791996 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.791957 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/784ba4f2-722d-4fc0-8ffe-c19a488607b2-home\") pod \"scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t\" (UID: \"784ba4f2-722d-4fc0-8ffe-c19a488607b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" Apr 16 16:59:37.792098 ip-10-0-137-126 
kubenswrapper[2572]: I0416 16:59:37.792016 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfzzp\" (UniqueName: \"kubernetes.io/projected/784ba4f2-722d-4fc0-8ffe-c19a488607b2-kube-api-access-lfzzp\") pod \"scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t\" (UID: \"784ba4f2-722d-4fc0-8ffe-c19a488607b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" Apr 16 16:59:37.792158 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.792131 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/784ba4f2-722d-4fc0-8ffe-c19a488607b2-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t\" (UID: \"784ba4f2-722d-4fc0-8ffe-c19a488607b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" Apr 16 16:59:37.792212 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.792174 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/784ba4f2-722d-4fc0-8ffe-c19a488607b2-model-cache\") pod \"scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t\" (UID: \"784ba4f2-722d-4fc0-8ffe-c19a488607b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" Apr 16 16:59:37.792549 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.792467 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/784ba4f2-722d-4fc0-8ffe-c19a488607b2-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t\" (UID: \"784ba4f2-722d-4fc0-8ffe-c19a488607b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" Apr 16 16:59:37.792910 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.792500 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/784ba4f2-722d-4fc0-8ffe-c19a488607b2-model-cache\") pod \"scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t\" (UID: \"784ba4f2-722d-4fc0-8ffe-c19a488607b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" Apr 16 16:59:37.793036 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.792853 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/784ba4f2-722d-4fc0-8ffe-c19a488607b2-home\") pod \"scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t\" (UID: \"784ba4f2-722d-4fc0-8ffe-c19a488607b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" Apr 16 16:59:37.794290 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.794270 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/784ba4f2-722d-4fc0-8ffe-c19a488607b2-dshm\") pod \"scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t\" (UID: \"784ba4f2-722d-4fc0-8ffe-c19a488607b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" Apr 16 16:59:37.795222 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.795202 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/784ba4f2-722d-4fc0-8ffe-c19a488607b2-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t\" (UID: \"784ba4f2-722d-4fc0-8ffe-c19a488607b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" Apr 16 16:59:37.804721 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.804677 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfzzp\" (UniqueName: \"kubernetes.io/projected/784ba4f2-722d-4fc0-8ffe-c19a488607b2-kube-api-access-lfzzp\") pod \"scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t\" (UID: \"784ba4f2-722d-4fc0-8ffe-c19a488607b2\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" Apr 16 16:59:37.876719 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.876681 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" Apr 16 16:59:37.892813 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.892775 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea556167-a6d8-4909-8c64-84c6894c322b-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7\" (UID: \"ea556167-a6d8-4909-8c64-84c6894c322b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" Apr 16 16:59:37.892962 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.892836 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psc5r\" (UniqueName: \"kubernetes.io/projected/ea556167-a6d8-4909-8c64-84c6894c322b-kube-api-access-psc5r\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7\" (UID: \"ea556167-a6d8-4909-8c64-84c6894c322b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" Apr 16 16:59:37.892962 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.892869 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ea556167-a6d8-4909-8c64-84c6894c322b-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7\" (UID: \"ea556167-a6d8-4909-8c64-84c6894c322b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" Apr 16 16:59:37.892962 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.892916 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ea556167-a6d8-4909-8c64-84c6894c322b-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7\" (UID: \"ea556167-a6d8-4909-8c64-84c6894c322b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" Apr 16 16:59:37.892962 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.892941 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ea556167-a6d8-4909-8c64-84c6894c322b-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7\" (UID: \"ea556167-a6d8-4909-8c64-84c6894c322b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" Apr 16 16:59:37.893177 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.893091 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ea556167-a6d8-4909-8c64-84c6894c322b-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7\" (UID: \"ea556167-a6d8-4909-8c64-84c6894c322b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" Apr 16 16:59:37.994238 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.994202 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea556167-a6d8-4909-8c64-84c6894c322b-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7\" (UID: \"ea556167-a6d8-4909-8c64-84c6894c322b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" Apr 16 16:59:37.994411 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.994259 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-psc5r\" (UniqueName: \"kubernetes.io/projected/ea556167-a6d8-4909-8c64-84c6894c322b-kube-api-access-psc5r\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7\" (UID: \"ea556167-a6d8-4909-8c64-84c6894c322b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" Apr 16 16:59:37.994411 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.994299 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ea556167-a6d8-4909-8c64-84c6894c322b-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7\" (UID: \"ea556167-a6d8-4909-8c64-84c6894c322b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" Apr 16 16:59:37.994411 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.994347 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ea556167-a6d8-4909-8c64-84c6894c322b-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7\" (UID: \"ea556167-a6d8-4909-8c64-84c6894c322b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" Apr 16 16:59:37.994411 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.994370 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ea556167-a6d8-4909-8c64-84c6894c322b-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7\" (UID: \"ea556167-a6d8-4909-8c64-84c6894c322b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" Apr 16 16:59:37.994632 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.994457 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ea556167-a6d8-4909-8c64-84c6894c322b-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7\" (UID: \"ea556167-a6d8-4909-8c64-84c6894c322b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" Apr 16 16:59:37.994632 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.994518 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea556167-a6d8-4909-8c64-84c6894c322b-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7\" (UID: \"ea556167-a6d8-4909-8c64-84c6894c322b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" Apr 16 16:59:37.994754 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.994652 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ea556167-a6d8-4909-8c64-84c6894c322b-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7\" (UID: \"ea556167-a6d8-4909-8c64-84c6894c322b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" Apr 16 16:59:37.994851 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.994823 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ea556167-a6d8-4909-8c64-84c6894c322b-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7\" (UID: \"ea556167-a6d8-4909-8c64-84c6894c322b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" Apr 16 16:59:37.994987 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.994866 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ea556167-a6d8-4909-8c64-84c6894c322b-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7\" (UID: \"ea556167-a6d8-4909-8c64-84c6894c322b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" Apr 16 16:59:37.996705 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:37.996685 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ea556167-a6d8-4909-8c64-84c6894c322b-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7\" (UID: \"ea556167-a6d8-4909-8c64-84c6894c322b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" Apr 16 16:59:38.010163 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:38.010139 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-psc5r\" (UniqueName: \"kubernetes.io/projected/ea556167-a6d8-4909-8c64-84c6894c322b-kube-api-access-psc5r\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7\" (UID: \"ea556167-a6d8-4909-8c64-84c6894c322b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" Apr 16 16:59:38.028561 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:38.028541 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t"] Apr 16 16:59:38.030394 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:59:38.030371 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod784ba4f2_722d_4fc0_8ffe_c19a488607b2.slice/crio-caf52847a66e09a4338cc5bc342647102c9e402d82340acfa76d0ff8225afb9f WatchSource:0}: Error finding container caf52847a66e09a4338cc5bc342647102c9e402d82340acfa76d0ff8225afb9f: Status 404 returned error can't find the container with id 
caf52847a66e09a4338cc5bc342647102c9e402d82340acfa76d0ff8225afb9f Apr 16 16:59:38.082545 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:38.082517 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" Apr 16 16:59:38.225316 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:38.225286 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7"] Apr 16 16:59:38.227178 ip-10-0-137-126 kubenswrapper[2572]: W0416 16:59:38.227149 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea556167_a6d8_4909_8c64_84c6894c322b.slice/crio-538256d9e69a4e64bfe648d3713a187bb6e46cd34157e7aa0f037293a18735c5 WatchSource:0}: Error finding container 538256d9e69a4e64bfe648d3713a187bb6e46cd34157e7aa0f037293a18735c5: Status 404 returned error can't find the container with id 538256d9e69a4e64bfe648d3713a187bb6e46cd34157e7aa0f037293a18735c5 Apr 16 16:59:38.258974 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:38.258789 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" event={"ID":"784ba4f2-722d-4fc0-8ffe-c19a488607b2","Type":"ContainerStarted","Data":"c3d7aea5dcf06d2802eeea93a48244bf94d096eae8693819da245b66aaf2c258"} Apr 16 16:59:38.258974 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:38.258831 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" event={"ID":"784ba4f2-722d-4fc0-8ffe-c19a488607b2","Type":"ContainerStarted","Data":"caf52847a66e09a4338cc5bc342647102c9e402d82340acfa76d0ff8225afb9f"} Apr 16 16:59:38.260369 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:38.260339 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" event={"ID":"ea556167-a6d8-4909-8c64-84c6894c322b","Type":"ContainerStarted","Data":"538256d9e69a4e64bfe648d3713a187bb6e46cd34157e7aa0f037293a18735c5"} Apr 16 16:59:38.661257 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:38.661222 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" Apr 16 16:59:38.661257 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:38.661258 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" Apr 16 16:59:38.662245 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:38.662197 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" podUID="d4689eed-e55a-4551-b7ee-12e706d076aa" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused" Apr 16 16:59:39.266584 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:39.266546 2572 generic.go:358] "Generic (PLEG): container finished" podID="ea556167-a6d8-4909-8c64-84c6894c322b" containerID="9517175bc439ce04fa674b04be31ff52c6cdd919afcbe99226385bd234fb4139" exitCode=0 Apr 16 16:59:39.266750 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:39.266623 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" event={"ID":"ea556167-a6d8-4909-8c64-84c6894c322b","Type":"ContainerDied","Data":"9517175bc439ce04fa674b04be31ff52c6cdd919afcbe99226385bd234fb4139"} Apr 16 16:59:41.279786 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:41.279737 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" event={"ID":"ea556167-a6d8-4909-8c64-84c6894c322b","Type":"ContainerStarted","Data":"7fdfa3cbfe32e440a9327f5776e2d96b4927e3501ab40f2b093e226b4bdc9d6e"} Apr 16 16:59:43.290370 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:43.290330 2572 generic.go:358] "Generic (PLEG): container finished" podID="784ba4f2-722d-4fc0-8ffe-c19a488607b2" containerID="c3d7aea5dcf06d2802eeea93a48244bf94d096eae8693819da245b66aaf2c258" exitCode=0 Apr 16 16:59:43.290843 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:43.290413 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" event={"ID":"784ba4f2-722d-4fc0-8ffe-c19a488607b2","Type":"ContainerDied","Data":"c3d7aea5dcf06d2802eeea93a48244bf94d096eae8693819da245b66aaf2c258"} Apr 16 16:59:44.297258 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:44.297218 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" event={"ID":"784ba4f2-722d-4fc0-8ffe-c19a488607b2","Type":"ContainerStarted","Data":"70d05a3ad68d19b51fe6230386fc8bbefda45793e17ce95c94ac009866c03df1"} Apr 16 16:59:44.317259 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:44.317207 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" podStartSLOduration=7.3171889629999995 podStartE2EDuration="7.317188963s" podCreationTimestamp="2026-04-16 16:59:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:59:44.315259456 +0000 UTC m=+701.483970830" watchObservedRunningTime="2026-04-16 16:59:44.317188963 +0000 UTC m=+701.485900327" Apr 16 16:59:47.877600 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:47.877513 2572 kubelet.go:2658] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" Apr 16 16:59:47.877600 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:47.877556 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" Apr 16 16:59:47.892521 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:47.892495 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" Apr 16 16:59:48.333807 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:48.333725 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" Apr 16 16:59:48.661213 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:48.661172 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" podUID="d4689eed-e55a-4551-b7ee-12e706d076aa" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused" Apr 16 16:59:58.661604 ip-10-0-137-126 kubenswrapper[2572]: I0416 16:59:58.661542 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" podUID="d4689eed-e55a-4551-b7ee-12e706d076aa" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused" Apr 16 17:00:08.661142 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:08.661097 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" podUID="d4689eed-e55a-4551-b7ee-12e706d076aa" containerName="main" probeResult="failure" output="Get 
\"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused" Apr 16 17:00:11.289909 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:11.289862 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t"] Apr 16 17:00:11.290333 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:11.290254 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" podUID="784ba4f2-722d-4fc0-8ffe-c19a488607b2" containerName="main" containerID="cri-o://70d05a3ad68d19b51fe6230386fc8bbefda45793e17ce95c94ac009866c03df1" gracePeriod=30 Apr 16 17:00:11.294379 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:11.294352 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7"] Apr 16 17:00:11.699875 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:11.699848 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" Apr 16 17:00:11.825607 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:11.825576 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/784ba4f2-722d-4fc0-8ffe-c19a488607b2-dshm\") pod \"784ba4f2-722d-4fc0-8ffe-c19a488607b2\" (UID: \"784ba4f2-722d-4fc0-8ffe-c19a488607b2\") " Apr 16 17:00:11.825777 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:11.825614 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/784ba4f2-722d-4fc0-8ffe-c19a488607b2-kserve-provision-location\") pod \"784ba4f2-722d-4fc0-8ffe-c19a488607b2\" (UID: \"784ba4f2-722d-4fc0-8ffe-c19a488607b2\") " Apr 16 17:00:11.825777 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:11.825655 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/784ba4f2-722d-4fc0-8ffe-c19a488607b2-home\") pod \"784ba4f2-722d-4fc0-8ffe-c19a488607b2\" (UID: \"784ba4f2-722d-4fc0-8ffe-c19a488607b2\") " Apr 16 17:00:11.825777 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:11.825680 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/784ba4f2-722d-4fc0-8ffe-c19a488607b2-model-cache\") pod \"784ba4f2-722d-4fc0-8ffe-c19a488607b2\" (UID: \"784ba4f2-722d-4fc0-8ffe-c19a488607b2\") " Apr 16 17:00:11.825777 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:11.825726 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/784ba4f2-722d-4fc0-8ffe-c19a488607b2-tls-certs\") pod \"784ba4f2-722d-4fc0-8ffe-c19a488607b2\" (UID: \"784ba4f2-722d-4fc0-8ffe-c19a488607b2\") " Apr 16 17:00:11.825777 ip-10-0-137-126 kubenswrapper[2572]: I0416 
17:00:11.825757 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfzzp\" (UniqueName: \"kubernetes.io/projected/784ba4f2-722d-4fc0-8ffe-c19a488607b2-kube-api-access-lfzzp\") pod \"784ba4f2-722d-4fc0-8ffe-c19a488607b2\" (UID: \"784ba4f2-722d-4fc0-8ffe-c19a488607b2\") " Apr 16 17:00:11.826029 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:11.825957 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/784ba4f2-722d-4fc0-8ffe-c19a488607b2-home" (OuterVolumeSpecName: "home") pod "784ba4f2-722d-4fc0-8ffe-c19a488607b2" (UID: "784ba4f2-722d-4fc0-8ffe-c19a488607b2"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:00:11.826029 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:11.825993 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/784ba4f2-722d-4fc0-8ffe-c19a488607b2-model-cache" (OuterVolumeSpecName: "model-cache") pod "784ba4f2-722d-4fc0-8ffe-c19a488607b2" (UID: "784ba4f2-722d-4fc0-8ffe-c19a488607b2"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:00:11.828023 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:11.827981 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/784ba4f2-722d-4fc0-8ffe-c19a488607b2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "784ba4f2-722d-4fc0-8ffe-c19a488607b2" (UID: "784ba4f2-722d-4fc0-8ffe-c19a488607b2"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:00:11.828023 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:11.828005 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/784ba4f2-722d-4fc0-8ffe-c19a488607b2-kube-api-access-lfzzp" (OuterVolumeSpecName: "kube-api-access-lfzzp") pod "784ba4f2-722d-4fc0-8ffe-c19a488607b2" (UID: "784ba4f2-722d-4fc0-8ffe-c19a488607b2"). InnerVolumeSpecName "kube-api-access-lfzzp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:00:11.828227 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:11.828122 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/784ba4f2-722d-4fc0-8ffe-c19a488607b2-dshm" (OuterVolumeSpecName: "dshm") pod "784ba4f2-722d-4fc0-8ffe-c19a488607b2" (UID: "784ba4f2-722d-4fc0-8ffe-c19a488607b2"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:00:11.885728 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:11.885689 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/784ba4f2-722d-4fc0-8ffe-c19a488607b2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "784ba4f2-722d-4fc0-8ffe-c19a488607b2" (UID: "784ba4f2-722d-4fc0-8ffe-c19a488607b2"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:00:11.926737 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:11.926710 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/784ba4f2-722d-4fc0-8ffe-c19a488607b2-home\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:00:11.926737 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:11.926737 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/784ba4f2-722d-4fc0-8ffe-c19a488607b2-model-cache\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:00:11.926914 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:11.926748 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/784ba4f2-722d-4fc0-8ffe-c19a488607b2-tls-certs\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:00:11.926914 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:11.926758 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lfzzp\" (UniqueName: \"kubernetes.io/projected/784ba4f2-722d-4fc0-8ffe-c19a488607b2-kube-api-access-lfzzp\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:00:11.926914 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:11.926768 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/784ba4f2-722d-4fc0-8ffe-c19a488607b2-dshm\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:00:11.926914 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:11.926776 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/784ba4f2-722d-4fc0-8ffe-c19a488607b2-kserve-provision-location\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:00:12.433569 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:12.433534 2572 
generic.go:358] "Generic (PLEG): container finished" podID="784ba4f2-722d-4fc0-8ffe-c19a488607b2" containerID="70d05a3ad68d19b51fe6230386fc8bbefda45793e17ce95c94ac009866c03df1" exitCode=0 Apr 16 17:00:12.433994 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:12.433614 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" event={"ID":"784ba4f2-722d-4fc0-8ffe-c19a488607b2","Type":"ContainerDied","Data":"70d05a3ad68d19b51fe6230386fc8bbefda45793e17ce95c94ac009866c03df1"} Apr 16 17:00:12.433994 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:12.433625 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" Apr 16 17:00:12.433994 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:12.433649 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t" event={"ID":"784ba4f2-722d-4fc0-8ffe-c19a488607b2","Type":"ContainerDied","Data":"caf52847a66e09a4338cc5bc342647102c9e402d82340acfa76d0ff8225afb9f"} Apr 16 17:00:12.433994 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:12.433672 2572 scope.go:117] "RemoveContainer" containerID="70d05a3ad68d19b51fe6230386fc8bbefda45793e17ce95c94ac009866c03df1" Apr 16 17:00:12.436185 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:12.436158 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" event={"ID":"ea556167-a6d8-4909-8c64-84c6894c322b","Type":"ContainerStarted","Data":"917c7f35a28a5ac5e67dab3a3177756d58a395f38c5e390b94d8687489ea1ee8"} Apr 16 17:00:12.436307 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:12.436262 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" 
podUID="ea556167-a6d8-4909-8c64-84c6894c322b" containerName="main" containerID="cri-o://7fdfa3cbfe32e440a9327f5776e2d96b4927e3501ab40f2b093e226b4bdc9d6e" gracePeriod=30 Apr 16 17:00:12.436373 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:12.436302 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" podUID="ea556167-a6d8-4909-8c64-84c6894c322b" containerName="tokenizer" containerID="cri-o://917c7f35a28a5ac5e67dab3a3177756d58a395f38c5e390b94d8687489ea1ee8" gracePeriod=30 Apr 16 17:00:12.436430 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:12.436371 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" Apr 16 17:00:12.439623 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:12.439565 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" podUID="ea556167-a6d8-4909-8c64-84c6894c322b" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 17:00:12.445096 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:12.445072 2572 scope.go:117] "RemoveContainer" containerID="c3d7aea5dcf06d2802eeea93a48244bf94d096eae8693819da245b66aaf2c258" Apr 16 17:00:12.457742 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:12.457716 2572 scope.go:117] "RemoveContainer" containerID="70d05a3ad68d19b51fe6230386fc8bbefda45793e17ce95c94ac009866c03df1" Apr 16 17:00:12.458080 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:00:12.458044 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70d05a3ad68d19b51fe6230386fc8bbefda45793e17ce95c94ac009866c03df1\": container with ID starting with 70d05a3ad68d19b51fe6230386fc8bbefda45793e17ce95c94ac009866c03df1 not found: ID 
does not exist" containerID="70d05a3ad68d19b51fe6230386fc8bbefda45793e17ce95c94ac009866c03df1" Apr 16 17:00:12.458194 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:12.458164 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70d05a3ad68d19b51fe6230386fc8bbefda45793e17ce95c94ac009866c03df1"} err="failed to get container status \"70d05a3ad68d19b51fe6230386fc8bbefda45793e17ce95c94ac009866c03df1\": rpc error: code = NotFound desc = could not find container \"70d05a3ad68d19b51fe6230386fc8bbefda45793e17ce95c94ac009866c03df1\": container with ID starting with 70d05a3ad68d19b51fe6230386fc8bbefda45793e17ce95c94ac009866c03df1 not found: ID does not exist" Apr 16 17:00:12.458267 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:12.458200 2572 scope.go:117] "RemoveContainer" containerID="c3d7aea5dcf06d2802eeea93a48244bf94d096eae8693819da245b66aaf2c258" Apr 16 17:00:12.458501 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:00:12.458478 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3d7aea5dcf06d2802eeea93a48244bf94d096eae8693819da245b66aaf2c258\": container with ID starting with c3d7aea5dcf06d2802eeea93a48244bf94d096eae8693819da245b66aaf2c258 not found: ID does not exist" containerID="c3d7aea5dcf06d2802eeea93a48244bf94d096eae8693819da245b66aaf2c258" Apr 16 17:00:12.458607 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:12.458504 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3d7aea5dcf06d2802eeea93a48244bf94d096eae8693819da245b66aaf2c258"} err="failed to get container status \"c3d7aea5dcf06d2802eeea93a48244bf94d096eae8693819da245b66aaf2c258\": rpc error: code = NotFound desc = could not find container \"c3d7aea5dcf06d2802eeea93a48244bf94d096eae8693819da245b66aaf2c258\": container with ID starting with c3d7aea5dcf06d2802eeea93a48244bf94d096eae8693819da245b66aaf2c258 not found: ID does not 
exist" Apr 16 17:00:12.461508 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:12.461457 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" podStartSLOduration=2.5433488 podStartE2EDuration="35.46144226s" podCreationTimestamp="2026-04-16 16:59:37 +0000 UTC" firstStartedPulling="2026-04-16 16:59:39.267757078 +0000 UTC m=+696.436468420" lastFinishedPulling="2026-04-16 17:00:12.185850529 +0000 UTC m=+729.354561880" observedRunningTime="2026-04-16 17:00:12.45624829 +0000 UTC m=+729.624959656" watchObservedRunningTime="2026-04-16 17:00:12.46144226 +0000 UTC m=+729.630153639" Apr 16 17:00:12.476455 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:12.476389 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t"] Apr 16 17:00:12.483580 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:12.483550 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7c644fd568-xlr5t"] Apr 16 17:00:13.444477 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:13.444440 2572 generic.go:358] "Generic (PLEG): container finished" podID="ea556167-a6d8-4909-8c64-84c6894c322b" containerID="7fdfa3cbfe32e440a9327f5776e2d96b4927e3501ab40f2b093e226b4bdc9d6e" exitCode=0 Apr 16 17:00:13.444935 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:13.444510 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" event={"ID":"ea556167-a6d8-4909-8c64-84c6894c322b","Type":"ContainerDied","Data":"7fdfa3cbfe32e440a9327f5776e2d96b4927e3501ab40f2b093e226b4bdc9d6e"} Apr 16 17:00:13.591632 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:13.591594 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="784ba4f2-722d-4fc0-8ffe-c19a488607b2" 
path="/var/lib/kubelet/pods/784ba4f2-722d-4fc0-8ffe-c19a488607b2/volumes" Apr 16 17:00:18.083273 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:18.083227 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" Apr 16 17:00:18.661441 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:18.661381 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" podUID="d4689eed-e55a-4551-b7ee-12e706d076aa" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused" Apr 16 17:00:22.436883 ip-10-0-137-126 kubenswrapper[2572]: W0416 17:00:22.436852 2572 logging.go:55] [core] [Channel #20 SubChannel #21]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.52:9003", ServerName: "10.133.0.52:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.52:9003: connect: connection refused" Apr 16 17:00:23.437761 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:23.437708 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" podUID="ea556167-a6d8-4909-8c64-84c6894c322b" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.52:9003\" within 1s: context deadline exceeded" Apr 16 17:00:23.437761 ip-10-0-137-126 kubenswrapper[2572]: W0416 17:00:23.437748 2572 logging.go:55] [core] [Channel #20 SubChannel #21]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.52:9003", ServerName: "10.133.0.52:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.52:9003: connect: connection refused" Apr 16 17:00:28.660742 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:28.660692 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" podUID="d4689eed-e55a-4551-b7ee-12e706d076aa" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused" Apr 16 17:00:31.856226 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:31.856188 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7"] Apr 16 17:00:31.856886 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:31.856860 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="784ba4f2-722d-4fc0-8ffe-c19a488607b2" containerName="storage-initializer" Apr 16 17:00:31.856886 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:31.856888 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="784ba4f2-722d-4fc0-8ffe-c19a488607b2" containerName="storage-initializer" Apr 16 17:00:31.857040 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:31.856902 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="784ba4f2-722d-4fc0-8ffe-c19a488607b2" containerName="main" Apr 16 17:00:31.857040 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:31.856911 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="784ba4f2-722d-4fc0-8ffe-c19a488607b2" containerName="main" Apr 16 17:00:31.857040 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:31.857007 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="784ba4f2-722d-4fc0-8ffe-c19a488607b2" containerName="main" Apr 16 17:00:32.437621 ip-10-0-137-126 kubenswrapper[2572]: W0416 17:00:32.437589 2572 logging.go:55] [core] [Channel #22 SubChannel #23]grpc: 
addrConn.createTransport failed to connect to {Addr: "10.133.0.52:9003", ServerName: "10.133.0.52:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.52:9003: connect: connection refused" Apr 16 17:00:32.445223 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:32.445199 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7"] Apr 16 17:00:32.445342 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:32.445311 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" Apr 16 17:00:32.448247 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:32.448227 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 16 17:00:32.612668 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:32.612633 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x64x\" (UniqueName: \"kubernetes.io/projected/13e4ce25-364d-472a-af15-6a223472a3e3-kube-api-access-4x64x\") pod \"precise-prefix-cache-test-kserve-7db845475b-rngn7\" (UID: \"13e4ce25-364d-472a-af15-6a223472a3e3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" Apr 16 17:00:32.612872 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:32.612690 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/13e4ce25-364d-472a-af15-6a223472a3e3-model-cache\") pod \"precise-prefix-cache-test-kserve-7db845475b-rngn7\" (UID: \"13e4ce25-364d-472a-af15-6a223472a3e3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" Apr 16 17:00:32.612872 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:32.612757 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/13e4ce25-364d-472a-af15-6a223472a3e3-home\") pod \"precise-prefix-cache-test-kserve-7db845475b-rngn7\" (UID: \"13e4ce25-364d-472a-af15-6a223472a3e3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" Apr 16 17:00:32.612872 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:32.612785 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13e4ce25-364d-472a-af15-6a223472a3e3-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-7db845475b-rngn7\" (UID: \"13e4ce25-364d-472a-af15-6a223472a3e3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" Apr 16 17:00:32.612872 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:32.612822 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/13e4ce25-364d-472a-af15-6a223472a3e3-dshm\") pod \"precise-prefix-cache-test-kserve-7db845475b-rngn7\" (UID: \"13e4ce25-364d-472a-af15-6a223472a3e3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" Apr 16 17:00:32.612872 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:32.612853 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/13e4ce25-364d-472a-af15-6a223472a3e3-tls-certs\") pod \"precise-prefix-cache-test-kserve-7db845475b-rngn7\" (UID: \"13e4ce25-364d-472a-af15-6a223472a3e3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" Apr 16 17:00:32.713793 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:32.713703 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/13e4ce25-364d-472a-af15-6a223472a3e3-model-cache\") pod \"precise-prefix-cache-test-kserve-7db845475b-rngn7\" (UID: \"13e4ce25-364d-472a-af15-6a223472a3e3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" Apr 16 17:00:32.713793 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:32.713739 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/13e4ce25-364d-472a-af15-6a223472a3e3-home\") pod \"precise-prefix-cache-test-kserve-7db845475b-rngn7\" (UID: \"13e4ce25-364d-472a-af15-6a223472a3e3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" Apr 16 17:00:32.713793 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:32.713755 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13e4ce25-364d-472a-af15-6a223472a3e3-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-7db845475b-rngn7\" (UID: \"13e4ce25-364d-472a-af15-6a223472a3e3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" Apr 16 17:00:32.713793 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:32.713785 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/13e4ce25-364d-472a-af15-6a223472a3e3-dshm\") pod \"precise-prefix-cache-test-kserve-7db845475b-rngn7\" (UID: \"13e4ce25-364d-472a-af15-6a223472a3e3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" Apr 16 17:00:32.714139 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:32.713815 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/13e4ce25-364d-472a-af15-6a223472a3e3-tls-certs\") pod \"precise-prefix-cache-test-kserve-7db845475b-rngn7\" (UID: \"13e4ce25-364d-472a-af15-6a223472a3e3\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" Apr 16 17:00:32.714139 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:32.713878 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4x64x\" (UniqueName: \"kubernetes.io/projected/13e4ce25-364d-472a-af15-6a223472a3e3-kube-api-access-4x64x\") pod \"precise-prefix-cache-test-kserve-7db845475b-rngn7\" (UID: \"13e4ce25-364d-472a-af15-6a223472a3e3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" Apr 16 17:00:32.714248 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:32.714201 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/13e4ce25-364d-472a-af15-6a223472a3e3-home\") pod \"precise-prefix-cache-test-kserve-7db845475b-rngn7\" (UID: \"13e4ce25-364d-472a-af15-6a223472a3e3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" Apr 16 17:00:32.714302 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:32.714245 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13e4ce25-364d-472a-af15-6a223472a3e3-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-7db845475b-rngn7\" (UID: \"13e4ce25-364d-472a-af15-6a223472a3e3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" Apr 16 17:00:32.714381 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:32.714363 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/13e4ce25-364d-472a-af15-6a223472a3e3-model-cache\") pod \"precise-prefix-cache-test-kserve-7db845475b-rngn7\" (UID: \"13e4ce25-364d-472a-af15-6a223472a3e3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" Apr 16 17:00:32.716173 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:32.716147 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/13e4ce25-364d-472a-af15-6a223472a3e3-dshm\") pod \"precise-prefix-cache-test-kserve-7db845475b-rngn7\" (UID: \"13e4ce25-364d-472a-af15-6a223472a3e3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" Apr 16 17:00:32.716481 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:32.716461 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/13e4ce25-364d-472a-af15-6a223472a3e3-tls-certs\") pod \"precise-prefix-cache-test-kserve-7db845475b-rngn7\" (UID: \"13e4ce25-364d-472a-af15-6a223472a3e3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" Apr 16 17:00:32.721970 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:32.721948 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x64x\" (UniqueName: \"kubernetes.io/projected/13e4ce25-364d-472a-af15-6a223472a3e3-kube-api-access-4x64x\") pod \"precise-prefix-cache-test-kserve-7db845475b-rngn7\" (UID: \"13e4ce25-364d-472a-af15-6a223472a3e3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" Apr 16 17:00:32.758879 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:32.758850 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" Apr 16 17:00:32.888632 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:32.888607 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7"] Apr 16 17:00:32.890974 ip-10-0-137-126 kubenswrapper[2572]: W0416 17:00:32.890948 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13e4ce25_364d_472a_af15_6a223472a3e3.slice/crio-10f3e2b69cc9e43c4c0f0b48dc3a77d4b468863e7842c169896a5138dbf83b53 WatchSource:0}: Error finding container 10f3e2b69cc9e43c4c0f0b48dc3a77d4b468863e7842c169896a5138dbf83b53: Status 404 returned error can't find the container with id 10f3e2b69cc9e43c4c0f0b48dc3a77d4b468863e7842c169896a5138dbf83b53 Apr 16 17:00:33.437706 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:33.437662 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" podUID="ea556167-a6d8-4909-8c64-84c6894c322b" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.52:9003\" within 1s: context deadline exceeded" Apr 16 17:00:33.528729 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:33.528697 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" event={"ID":"13e4ce25-364d-472a-af15-6a223472a3e3","Type":"ContainerStarted","Data":"4bde44a4c066b8d07f9e396d6007c95f8e5a8df57737b891b145d4491dcb5bb9"} Apr 16 17:00:33.528729 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:33.528734 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" 
event={"ID":"13e4ce25-364d-472a-af15-6a223472a3e3","Type":"ContainerStarted","Data":"10f3e2b69cc9e43c4c0f0b48dc3a77d4b468863e7842c169896a5138dbf83b53"} Apr 16 17:00:37.548843 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:37.548756 2572 generic.go:358] "Generic (PLEG): container finished" podID="13e4ce25-364d-472a-af15-6a223472a3e3" containerID="4bde44a4c066b8d07f9e396d6007c95f8e5a8df57737b891b145d4491dcb5bb9" exitCode=0 Apr 16 17:00:37.548843 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:37.548821 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" event={"ID":"13e4ce25-364d-472a-af15-6a223472a3e3","Type":"ContainerDied","Data":"4bde44a4c066b8d07f9e396d6007c95f8e5a8df57737b891b145d4491dcb5bb9"} Apr 16 17:00:38.554951 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:38.554916 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" event={"ID":"13e4ce25-364d-472a-af15-6a223472a3e3","Type":"ContainerStarted","Data":"6ef8e0b32dc2382332913a4654dcb61d75d4edc992a5972ed116b3a056e13fc9"} Apr 16 17:00:38.577075 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:38.577020 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" podStartSLOduration=7.577007516 podStartE2EDuration="7.577007516s" podCreationTimestamp="2026-04-16 17:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:00:38.574421689 +0000 UTC m=+755.743133053" watchObservedRunningTime="2026-04-16 17:00:38.577007516 +0000 UTC m=+755.745718878" Apr 16 17:00:38.661145 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:38.661108 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" 
podUID="d4689eed-e55a-4551-b7ee-12e706d076aa" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused" Apr 16 17:00:42.437276 ip-10-0-137-126 kubenswrapper[2572]: W0416 17:00:42.437247 2572 logging.go:55] [core] [Channel #24 SubChannel #25]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.52:9003", ServerName: "10.133.0.52:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.52:9003: connect: connection refused" Apr 16 17:00:42.571522 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:42.571494 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7_ea556167-a6d8-4909-8c64-84c6894c322b/tokenizer/0.log" Apr 16 17:00:42.572200 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:42.572178 2572 generic.go:358] "Generic (PLEG): container finished" podID="ea556167-a6d8-4909-8c64-84c6894c322b" containerID="917c7f35a28a5ac5e67dab3a3177756d58a395f38c5e390b94d8687489ea1ee8" exitCode=137 Apr 16 17:00:42.572278 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:42.572248 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" event={"ID":"ea556167-a6d8-4909-8c64-84c6894c322b","Type":"ContainerDied","Data":"917c7f35a28a5ac5e67dab3a3177756d58a395f38c5e390b94d8687489ea1ee8"} Apr 16 17:00:42.759049 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:42.758956 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" Apr 16 17:00:42.759049 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:42.758999 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" Apr 16 17:00:42.771458 
ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:42.771432 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" Apr 16 17:00:43.085784 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:43.085764 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7_ea556167-a6d8-4909-8c64-84c6894c322b/tokenizer/0.log" Apr 16 17:00:43.086459 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:43.086435 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" Apr 16 17:00:43.097369 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:43.097348 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ea556167-a6d8-4909-8c64-84c6894c322b-tls-certs\") pod \"ea556167-a6d8-4909-8c64-84c6894c322b\" (UID: \"ea556167-a6d8-4909-8c64-84c6894c322b\") " Apr 16 17:00:43.097474 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:43.097379 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ea556167-a6d8-4909-8c64-84c6894c322b-tokenizer-tmp\") pod \"ea556167-a6d8-4909-8c64-84c6894c322b\" (UID: \"ea556167-a6d8-4909-8c64-84c6894c322b\") " Apr 16 17:00:43.097474 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:43.097404 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psc5r\" (UniqueName: \"kubernetes.io/projected/ea556167-a6d8-4909-8c64-84c6894c322b-kube-api-access-psc5r\") pod \"ea556167-a6d8-4909-8c64-84c6894c322b\" (UID: \"ea556167-a6d8-4909-8c64-84c6894c322b\") " Apr 16 17:00:43.097474 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:43.097434 2572 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea556167-a6d8-4909-8c64-84c6894c322b-kserve-provision-location\") pod \"ea556167-a6d8-4909-8c64-84c6894c322b\" (UID: \"ea556167-a6d8-4909-8c64-84c6894c322b\") " Apr 16 17:00:43.097474 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:43.097457 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ea556167-a6d8-4909-8c64-84c6894c322b-tokenizer-uds\") pod \"ea556167-a6d8-4909-8c64-84c6894c322b\" (UID: \"ea556167-a6d8-4909-8c64-84c6894c322b\") " Apr 16 17:00:43.097681 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:43.097518 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ea556167-a6d8-4909-8c64-84c6894c322b-tokenizer-cache\") pod \"ea556167-a6d8-4909-8c64-84c6894c322b\" (UID: \"ea556167-a6d8-4909-8c64-84c6894c322b\") " Apr 16 17:00:43.097757 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:43.097728 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea556167-a6d8-4909-8c64-84c6894c322b-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "ea556167-a6d8-4909-8c64-84c6894c322b" (UID: "ea556167-a6d8-4909-8c64-84c6894c322b"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:00:43.097757 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:43.097747 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea556167-a6d8-4909-8c64-84c6894c322b-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "ea556167-a6d8-4909-8c64-84c6894c322b" (UID: "ea556167-a6d8-4909-8c64-84c6894c322b"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:00:43.097891 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:43.097871 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea556167-a6d8-4909-8c64-84c6894c322b-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "ea556167-a6d8-4909-8c64-84c6894c322b" (UID: "ea556167-a6d8-4909-8c64-84c6894c322b"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:00:43.098211 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:43.098182 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea556167-a6d8-4909-8c64-84c6894c322b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ea556167-a6d8-4909-8c64-84c6894c322b" (UID: "ea556167-a6d8-4909-8c64-84c6894c322b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:00:43.099462 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:43.099441 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea556167-a6d8-4909-8c64-84c6894c322b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ea556167-a6d8-4909-8c64-84c6894c322b" (UID: "ea556167-a6d8-4909-8c64-84c6894c322b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:00:43.099554 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:43.099465 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea556167-a6d8-4909-8c64-84c6894c322b-kube-api-access-psc5r" (OuterVolumeSpecName: "kube-api-access-psc5r") pod "ea556167-a6d8-4909-8c64-84c6894c322b" (UID: "ea556167-a6d8-4909-8c64-84c6894c322b"). InnerVolumeSpecName "kube-api-access-psc5r". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:00:43.198729 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:43.198698 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea556167-a6d8-4909-8c64-84c6894c322b-kserve-provision-location\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:00:43.198729 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:43.198723 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ea556167-a6d8-4909-8c64-84c6894c322b-tokenizer-uds\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:00:43.198729 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:43.198734 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ea556167-a6d8-4909-8c64-84c6894c322b-tokenizer-cache\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:00:43.198943 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:43.198744 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ea556167-a6d8-4909-8c64-84c6894c322b-tls-certs\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:00:43.198943 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:43.198755 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ea556167-a6d8-4909-8c64-84c6894c322b-tokenizer-tmp\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:00:43.198943 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:43.198765 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-psc5r\" (UniqueName: \"kubernetes.io/projected/ea556167-a6d8-4909-8c64-84c6894c322b-kube-api-access-psc5r\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:00:43.437566 ip-10-0-137-126 
kubenswrapper[2572]: I0416 17:00:43.437527 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" podUID="ea556167-a6d8-4909-8c64-84c6894c322b" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.52:9003\" within 1s: context deadline exceeded" Apr 16 17:00:43.577677 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:43.577648 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7_ea556167-a6d8-4909-8c64-84c6894c322b/tokenizer/0.log" Apr 16 17:00:43.578414 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:43.578393 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" Apr 16 17:00:43.578414 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:43.578395 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7" event={"ID":"ea556167-a6d8-4909-8c64-84c6894c322b","Type":"ContainerDied","Data":"538256d9e69a4e64bfe648d3713a187bb6e46cd34157e7aa0f037293a18735c5"} Apr 16 17:00:43.578592 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:43.578451 2572 scope.go:117] "RemoveContainer" containerID="917c7f35a28a5ac5e67dab3a3177756d58a395f38c5e390b94d8687489ea1ee8" Apr 16 17:00:43.589453 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:43.589432 2572 scope.go:117] "RemoveContainer" containerID="7fdfa3cbfe32e440a9327f5776e2d96b4927e3501ab40f2b093e226b4bdc9d6e" Apr 16 17:00:43.593327 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:43.593286 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" Apr 16 17:00:43.598231 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:43.598217 
2572 scope.go:117] "RemoveContainer" containerID="9517175bc439ce04fa674b04be31ff52c6cdd919afcbe99226385bd234fb4139" Apr 16 17:00:43.605921 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:43.605901 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7"] Apr 16 17:00:43.609111 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:43.609090 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4f9kcx7"] Apr 16 17:00:45.589893 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:45.589856 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea556167-a6d8-4909-8c64-84c6894c322b" path="/var/lib/kubelet/pods/ea556167-a6d8-4909-8c64-84c6894c322b/volumes" Apr 16 17:00:48.660930 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:48.660895 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" podUID="d4689eed-e55a-4551-b7ee-12e706d076aa" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused" Apr 16 17:00:58.660884 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:00:58.660834 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" podUID="d4689eed-e55a-4551-b7ee-12e706d076aa" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused" Apr 16 17:01:05.285016 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:05.284982 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7"] Apr 16 17:01:05.285459 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:05.285260 2572 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" podUID="13e4ce25-364d-472a-af15-6a223472a3e3" containerName="main" containerID="cri-o://6ef8e0b32dc2382332913a4654dcb61d75d4edc992a5972ed116b3a056e13fc9" gracePeriod=30 Apr 16 17:01:05.556319 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:05.556284 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" Apr 16 17:01:05.669453 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:05.669415 2572 generic.go:358] "Generic (PLEG): container finished" podID="13e4ce25-364d-472a-af15-6a223472a3e3" containerID="6ef8e0b32dc2382332913a4654dcb61d75d4edc992a5972ed116b3a056e13fc9" exitCode=0 Apr 16 17:01:05.669608 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:05.669487 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" Apr 16 17:01:05.669608 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:05.669496 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" event={"ID":"13e4ce25-364d-472a-af15-6a223472a3e3","Type":"ContainerDied","Data":"6ef8e0b32dc2382332913a4654dcb61d75d4edc992a5972ed116b3a056e13fc9"} Apr 16 17:01:05.669608 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:05.669532 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7" event={"ID":"13e4ce25-364d-472a-af15-6a223472a3e3","Type":"ContainerDied","Data":"10f3e2b69cc9e43c4c0f0b48dc3a77d4b468863e7842c169896a5138dbf83b53"} Apr 16 17:01:05.669608 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:05.669549 2572 scope.go:117] "RemoveContainer" containerID="6ef8e0b32dc2382332913a4654dcb61d75d4edc992a5972ed116b3a056e13fc9" Apr 16 
17:01:05.678562 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:05.678546 2572 scope.go:117] "RemoveContainer" containerID="4bde44a4c066b8d07f9e396d6007c95f8e5a8df57737b891b145d4491dcb5bb9" Apr 16 17:01:05.701845 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:05.701818 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13e4ce25-364d-472a-af15-6a223472a3e3-kserve-provision-location\") pod \"13e4ce25-364d-472a-af15-6a223472a3e3\" (UID: \"13e4ce25-364d-472a-af15-6a223472a3e3\") " Apr 16 17:01:05.702028 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:05.701854 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/13e4ce25-364d-472a-af15-6a223472a3e3-dshm\") pod \"13e4ce25-364d-472a-af15-6a223472a3e3\" (UID: \"13e4ce25-364d-472a-af15-6a223472a3e3\") " Apr 16 17:01:05.702028 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:05.701884 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/13e4ce25-364d-472a-af15-6a223472a3e3-tls-certs\") pod \"13e4ce25-364d-472a-af15-6a223472a3e3\" (UID: \"13e4ce25-364d-472a-af15-6a223472a3e3\") " Apr 16 17:01:05.702028 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:05.701906 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x64x\" (UniqueName: \"kubernetes.io/projected/13e4ce25-364d-472a-af15-6a223472a3e3-kube-api-access-4x64x\") pod \"13e4ce25-364d-472a-af15-6a223472a3e3\" (UID: \"13e4ce25-364d-472a-af15-6a223472a3e3\") " Apr 16 17:01:05.702335 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:05.702292 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/13e4ce25-364d-472a-af15-6a223472a3e3-home\") pod \"13e4ce25-364d-472a-af15-6a223472a3e3\" (UID: 
\"13e4ce25-364d-472a-af15-6a223472a3e3\") " Apr 16 17:01:05.702412 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:05.702387 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/13e4ce25-364d-472a-af15-6a223472a3e3-model-cache\") pod \"13e4ce25-364d-472a-af15-6a223472a3e3\" (UID: \"13e4ce25-364d-472a-af15-6a223472a3e3\") " Apr 16 17:01:05.702641 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:05.702602 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13e4ce25-364d-472a-af15-6a223472a3e3-home" (OuterVolumeSpecName: "home") pod "13e4ce25-364d-472a-af15-6a223472a3e3" (UID: "13e4ce25-364d-472a-af15-6a223472a3e3"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:01:05.702730 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:05.702660 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13e4ce25-364d-472a-af15-6a223472a3e3-model-cache" (OuterVolumeSpecName: "model-cache") pod "13e4ce25-364d-472a-af15-6a223472a3e3" (UID: "13e4ce25-364d-472a-af15-6a223472a3e3"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:01:05.702843 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:05.702813 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/13e4ce25-364d-472a-af15-6a223472a3e3-home\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:01:05.702905 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:05.702850 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/13e4ce25-364d-472a-af15-6a223472a3e3-model-cache\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:01:05.704334 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:05.704306 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13e4ce25-364d-472a-af15-6a223472a3e3-kube-api-access-4x64x" (OuterVolumeSpecName: "kube-api-access-4x64x") pod "13e4ce25-364d-472a-af15-6a223472a3e3" (UID: "13e4ce25-364d-472a-af15-6a223472a3e3"). InnerVolumeSpecName "kube-api-access-4x64x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:01:05.704464 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:05.704413 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13e4ce25-364d-472a-af15-6a223472a3e3-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "13e4ce25-364d-472a-af15-6a223472a3e3" (UID: "13e4ce25-364d-472a-af15-6a223472a3e3"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:01:05.704635 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:05.704609 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13e4ce25-364d-472a-af15-6a223472a3e3-dshm" (OuterVolumeSpecName: "dshm") pod "13e4ce25-364d-472a-af15-6a223472a3e3" (UID: "13e4ce25-364d-472a-af15-6a223472a3e3"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:01:05.749119 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:05.749087 2572 scope.go:117] "RemoveContainer" containerID="6ef8e0b32dc2382332913a4654dcb61d75d4edc992a5972ed116b3a056e13fc9" Apr 16 17:01:05.749493 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:01:05.749465 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ef8e0b32dc2382332913a4654dcb61d75d4edc992a5972ed116b3a056e13fc9\": container with ID starting with 6ef8e0b32dc2382332913a4654dcb61d75d4edc992a5972ed116b3a056e13fc9 not found: ID does not exist" containerID="6ef8e0b32dc2382332913a4654dcb61d75d4edc992a5972ed116b3a056e13fc9" Apr 16 17:01:05.749546 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:05.749506 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ef8e0b32dc2382332913a4654dcb61d75d4edc992a5972ed116b3a056e13fc9"} err="failed to get container status \"6ef8e0b32dc2382332913a4654dcb61d75d4edc992a5972ed116b3a056e13fc9\": rpc error: code = NotFound desc = could not find container \"6ef8e0b32dc2382332913a4654dcb61d75d4edc992a5972ed116b3a056e13fc9\": container with ID starting with 6ef8e0b32dc2382332913a4654dcb61d75d4edc992a5972ed116b3a056e13fc9 not found: ID does not exist" Apr 16 17:01:05.749546 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:05.749527 2572 scope.go:117] "RemoveContainer" containerID="4bde44a4c066b8d07f9e396d6007c95f8e5a8df57737b891b145d4491dcb5bb9" Apr 16 17:01:05.749810 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:01:05.749782 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bde44a4c066b8d07f9e396d6007c95f8e5a8df57737b891b145d4491dcb5bb9\": container with ID starting with 4bde44a4c066b8d07f9e396d6007c95f8e5a8df57737b891b145d4491dcb5bb9 not found: ID does not exist" 
containerID="4bde44a4c066b8d07f9e396d6007c95f8e5a8df57737b891b145d4491dcb5bb9" Apr 16 17:01:05.749894 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:05.749810 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bde44a4c066b8d07f9e396d6007c95f8e5a8df57737b891b145d4491dcb5bb9"} err="failed to get container status \"4bde44a4c066b8d07f9e396d6007c95f8e5a8df57737b891b145d4491dcb5bb9\": rpc error: code = NotFound desc = could not find container \"4bde44a4c066b8d07f9e396d6007c95f8e5a8df57737b891b145d4491dcb5bb9\": container with ID starting with 4bde44a4c066b8d07f9e396d6007c95f8e5a8df57737b891b145d4491dcb5bb9 not found: ID does not exist" Apr 16 17:01:05.763970 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:05.763940 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13e4ce25-364d-472a-af15-6a223472a3e3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "13e4ce25-364d-472a-af15-6a223472a3e3" (UID: "13e4ce25-364d-472a-af15-6a223472a3e3"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:01:05.803371 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:05.803299 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13e4ce25-364d-472a-af15-6a223472a3e3-kserve-provision-location\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:01:05.803371 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:05.803325 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/13e4ce25-364d-472a-af15-6a223472a3e3-dshm\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:01:05.803371 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:05.803336 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/13e4ce25-364d-472a-af15-6a223472a3e3-tls-certs\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:01:05.803371 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:05.803346 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4x64x\" (UniqueName: \"kubernetes.io/projected/13e4ce25-364d-472a-af15-6a223472a3e3-kube-api-access-4x64x\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:01:06.011260 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:06.011215 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7"] Apr 16 17:01:06.023734 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:06.023705 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-rngn7"] Apr 16 17:01:07.590928 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:07.590891 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13e4ce25-364d-472a-af15-6a223472a3e3" 
path="/var/lib/kubelet/pods/13e4ce25-364d-472a-af15-6a223472a3e3/volumes" Apr 16 17:01:08.670663 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:08.670631 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" Apr 16 17:01:08.678351 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:08.678323 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" Apr 16 17:01:18.257054 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.256962 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46"] Apr 16 17:01:18.257418 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.257329 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13e4ce25-364d-472a-af15-6a223472a3e3" containerName="storage-initializer" Apr 16 17:01:18.257418 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.257340 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="13e4ce25-364d-472a-af15-6a223472a3e3" containerName="storage-initializer" Apr 16 17:01:18.257418 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.257350 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13e4ce25-364d-472a-af15-6a223472a3e3" containerName="main" Apr 16 17:01:18.257418 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.257357 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="13e4ce25-364d-472a-af15-6a223472a3e3" containerName="main" Apr 16 17:01:18.257418 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.257368 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea556167-a6d8-4909-8c64-84c6894c322b" containerName="tokenizer" Apr 16 17:01:18.257418 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.257374 2572 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="ea556167-a6d8-4909-8c64-84c6894c322b" containerName="tokenizer" Apr 16 17:01:18.257418 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.257383 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea556167-a6d8-4909-8c64-84c6894c322b" containerName="storage-initializer" Apr 16 17:01:18.257418 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.257389 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea556167-a6d8-4909-8c64-84c6894c322b" containerName="storage-initializer" Apr 16 17:01:18.257418 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.257398 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea556167-a6d8-4909-8c64-84c6894c322b" containerName="main" Apr 16 17:01:18.257418 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.257404 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea556167-a6d8-4909-8c64-84c6894c322b" containerName="main" Apr 16 17:01:18.257770 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.257460 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="13e4ce25-364d-472a-af15-6a223472a3e3" containerName="main" Apr 16 17:01:18.257770 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.257471 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ea556167-a6d8-4909-8c64-84c6894c322b" containerName="tokenizer" Apr 16 17:01:18.257770 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.257477 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ea556167-a6d8-4909-8c64-84c6894c322b" containerName="main" Apr 16 17:01:18.263271 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.263249 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" Apr 16 17:01:18.266160 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.266137 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"conv-test-round-trip-kserve-self-signed-certs\"" Apr 16 17:01:18.272242 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.272221 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46"] Apr 16 17:01:18.417521 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.417495 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d595bc0e-6518-4c7f-94f0-909862c15dd1-model-cache\") pod \"conv-test-round-trip-kserve-fb84c864-t4g46\" (UID: \"d595bc0e-6518-4c7f-94f0-909862c15dd1\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" Apr 16 17:01:18.417671 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.417529 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7rd2\" (UniqueName: \"kubernetes.io/projected/d595bc0e-6518-4c7f-94f0-909862c15dd1-kube-api-access-q7rd2\") pod \"conv-test-round-trip-kserve-fb84c864-t4g46\" (UID: \"d595bc0e-6518-4c7f-94f0-909862c15dd1\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" Apr 16 17:01:18.417671 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.417568 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d595bc0e-6518-4c7f-94f0-909862c15dd1-dshm\") pod \"conv-test-round-trip-kserve-fb84c864-t4g46\" (UID: \"d595bc0e-6518-4c7f-94f0-909862c15dd1\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" Apr 16 17:01:18.417671 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.417632 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d595bc0e-6518-4c7f-94f0-909862c15dd1-tls-certs\") pod \"conv-test-round-trip-kserve-fb84c864-t4g46\" (UID: \"d595bc0e-6518-4c7f-94f0-909862c15dd1\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" Apr 16 17:01:18.417779 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.417679 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d595bc0e-6518-4c7f-94f0-909862c15dd1-home\") pod \"conv-test-round-trip-kserve-fb84c864-t4g46\" (UID: \"d595bc0e-6518-4c7f-94f0-909862c15dd1\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" Apr 16 17:01:18.417779 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.417706 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d595bc0e-6518-4c7f-94f0-909862c15dd1-kserve-provision-location\") pod \"conv-test-round-trip-kserve-fb84c864-t4g46\" (UID: \"d595bc0e-6518-4c7f-94f0-909862c15dd1\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" Apr 16 17:01:18.518437 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.518367 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d595bc0e-6518-4c7f-94f0-909862c15dd1-tls-certs\") pod \"conv-test-round-trip-kserve-fb84c864-t4g46\" (UID: \"d595bc0e-6518-4c7f-94f0-909862c15dd1\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" Apr 16 17:01:18.518437 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.518401 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d595bc0e-6518-4c7f-94f0-909862c15dd1-home\") pod 
\"conv-test-round-trip-kserve-fb84c864-t4g46\" (UID: \"d595bc0e-6518-4c7f-94f0-909862c15dd1\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" Apr 16 17:01:18.518437 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.518423 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d595bc0e-6518-4c7f-94f0-909862c15dd1-kserve-provision-location\") pod \"conv-test-round-trip-kserve-fb84c864-t4g46\" (UID: \"d595bc0e-6518-4c7f-94f0-909862c15dd1\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" Apr 16 17:01:18.518692 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.518453 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d595bc0e-6518-4c7f-94f0-909862c15dd1-model-cache\") pod \"conv-test-round-trip-kserve-fb84c864-t4g46\" (UID: \"d595bc0e-6518-4c7f-94f0-909862c15dd1\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" Apr 16 17:01:18.518692 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.518484 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q7rd2\" (UniqueName: \"kubernetes.io/projected/d595bc0e-6518-4c7f-94f0-909862c15dd1-kube-api-access-q7rd2\") pod \"conv-test-round-trip-kserve-fb84c864-t4g46\" (UID: \"d595bc0e-6518-4c7f-94f0-909862c15dd1\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" Apr 16 17:01:18.518692 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.518548 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d595bc0e-6518-4c7f-94f0-909862c15dd1-dshm\") pod \"conv-test-round-trip-kserve-fb84c864-t4g46\" (UID: \"d595bc0e-6518-4c7f-94f0-909862c15dd1\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" Apr 16 17:01:18.518848 ip-10-0-137-126 
kubenswrapper[2572]: I0416 17:01:18.518739 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d595bc0e-6518-4c7f-94f0-909862c15dd1-home\") pod \"conv-test-round-trip-kserve-fb84c864-t4g46\" (UID: \"d595bc0e-6518-4c7f-94f0-909862c15dd1\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" Apr 16 17:01:18.518848 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.518801 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d595bc0e-6518-4c7f-94f0-909862c15dd1-kserve-provision-location\") pod \"conv-test-round-trip-kserve-fb84c864-t4g46\" (UID: \"d595bc0e-6518-4c7f-94f0-909862c15dd1\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" Apr 16 17:01:18.518950 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.518848 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d595bc0e-6518-4c7f-94f0-909862c15dd1-model-cache\") pod \"conv-test-round-trip-kserve-fb84c864-t4g46\" (UID: \"d595bc0e-6518-4c7f-94f0-909862c15dd1\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" Apr 16 17:01:18.520673 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.520656 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d595bc0e-6518-4c7f-94f0-909862c15dd1-dshm\") pod \"conv-test-round-trip-kserve-fb84c864-t4g46\" (UID: \"d595bc0e-6518-4c7f-94f0-909862c15dd1\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" Apr 16 17:01:18.520756 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.520741 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d595bc0e-6518-4c7f-94f0-909862c15dd1-tls-certs\") pod \"conv-test-round-trip-kserve-fb84c864-t4g46\" (UID: 
\"d595bc0e-6518-4c7f-94f0-909862c15dd1\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" Apr 16 17:01:18.529806 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.529785 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7rd2\" (UniqueName: \"kubernetes.io/projected/d595bc0e-6518-4c7f-94f0-909862c15dd1-kube-api-access-q7rd2\") pod \"conv-test-round-trip-kserve-fb84c864-t4g46\" (UID: \"d595bc0e-6518-4c7f-94f0-909862c15dd1\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" Apr 16 17:01:18.574830 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.574804 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" Apr 16 17:01:18.911188 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:18.911158 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46"] Apr 16 17:01:18.911366 ip-10-0-137-126 kubenswrapper[2572]: W0416 17:01:18.911340 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd595bc0e_6518_4c7f_94f0_909862c15dd1.slice/crio-3a394cbe0b501987e5d42256d9cdcfee27e5ef1fd70a7f087397cea29b8b6e9a WatchSource:0}: Error finding container 3a394cbe0b501987e5d42256d9cdcfee27e5ef1fd70a7f087397cea29b8b6e9a: Status 404 returned error can't find the container with id 3a394cbe0b501987e5d42256d9cdcfee27e5ef1fd70a7f087397cea29b8b6e9a Apr 16 17:01:19.733868 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:19.733834 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" event={"ID":"d595bc0e-6518-4c7f-94f0-909862c15dd1","Type":"ContainerStarted","Data":"055444c47b1ac27474a82bcdd06e372e4ceb42489bd4e4a65acf8fb1c68e153f"} Apr 16 17:01:19.733868 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:19.733872 2572 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" event={"ID":"d595bc0e-6518-4c7f-94f0-909862c15dd1","Type":"ContainerStarted","Data":"3a394cbe0b501987e5d42256d9cdcfee27e5ef1fd70a7f087397cea29b8b6e9a"} Apr 16 17:01:23.749471 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:23.749436 2572 generic.go:358] "Generic (PLEG): container finished" podID="d595bc0e-6518-4c7f-94f0-909862c15dd1" containerID="055444c47b1ac27474a82bcdd06e372e4ceb42489bd4e4a65acf8fb1c68e153f" exitCode=0 Apr 16 17:01:23.749867 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:23.749510 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" event={"ID":"d595bc0e-6518-4c7f-94f0-909862c15dd1","Type":"ContainerDied","Data":"055444c47b1ac27474a82bcdd06e372e4ceb42489bd4e4a65acf8fb1c68e153f"} Apr 16 17:01:24.754719 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:24.754685 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" event={"ID":"d595bc0e-6518-4c7f-94f0-909862c15dd1","Type":"ContainerStarted","Data":"4ffeea82de78318820f1a7dde148c389b6fd7645d21aef0976835bd6f24775e2"} Apr 16 17:01:24.781816 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:24.781748 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" podStartSLOduration=6.781730489 podStartE2EDuration="6.781730489s" podCreationTimestamp="2026-04-16 17:01:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:01:24.776280882 +0000 UTC m=+801.944992272" watchObservedRunningTime="2026-04-16 17:01:24.781730489 +0000 UTC m=+801.950441856" Apr 16 17:01:28.575116 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:28.575083 2572 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" Apr 16 17:01:28.575531 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:28.575150 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" Apr 16 17:01:28.576771 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:28.576746 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" podUID="d595bc0e-6518-4c7f-94f0-909862c15dd1" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8000/health\": dial tcp 10.133.0.54:8000: connect: connection refused" Apr 16 17:01:38.575490 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:38.575436 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" podUID="d595bc0e-6518-4c7f-94f0-909862c15dd1" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8000/health\": dial tcp 10.133.0.54:8000: connect: connection refused" Apr 16 17:01:40.720149 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:40.720108 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz"] Apr 16 17:01:40.726348 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:40.726324 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" Apr 16 17:01:40.729117 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:40.729089 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 16 17:01:40.735017 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:40.734105 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz"] Apr 16 17:01:40.814218 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:40.814188 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/98c64805-15f2-4d73-bf1f-dc871c5eb243-kserve-provision-location\") pod \"stop-feature-test-kserve-5698dc8fcf-zspvz\" (UID: \"98c64805-15f2-4d73-bf1f-dc871c5eb243\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" Apr 16 17:01:40.814426 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:40.814231 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/98c64805-15f2-4d73-bf1f-dc871c5eb243-model-cache\") pod \"stop-feature-test-kserve-5698dc8fcf-zspvz\" (UID: \"98c64805-15f2-4d73-bf1f-dc871c5eb243\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" Apr 16 17:01:40.814426 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:40.814275 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/98c64805-15f2-4d73-bf1f-dc871c5eb243-dshm\") pod \"stop-feature-test-kserve-5698dc8fcf-zspvz\" (UID: \"98c64805-15f2-4d73-bf1f-dc871c5eb243\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" Apr 16 17:01:40.814426 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:40.814306 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/98c64805-15f2-4d73-bf1f-dc871c5eb243-home\") pod \"stop-feature-test-kserve-5698dc8fcf-zspvz\" (UID: \"98c64805-15f2-4d73-bf1f-dc871c5eb243\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" Apr 16 17:01:40.814426 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:40.814368 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/98c64805-15f2-4d73-bf1f-dc871c5eb243-tls-certs\") pod \"stop-feature-test-kserve-5698dc8fcf-zspvz\" (UID: \"98c64805-15f2-4d73-bf1f-dc871c5eb243\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" Apr 16 17:01:40.814426 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:40.814390 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vljl8\" (UniqueName: \"kubernetes.io/projected/98c64805-15f2-4d73-bf1f-dc871c5eb243-kube-api-access-vljl8\") pod \"stop-feature-test-kserve-5698dc8fcf-zspvz\" (UID: \"98c64805-15f2-4d73-bf1f-dc871c5eb243\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" Apr 16 17:01:40.915163 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:40.915132 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/98c64805-15f2-4d73-bf1f-dc871c5eb243-kserve-provision-location\") pod \"stop-feature-test-kserve-5698dc8fcf-zspvz\" (UID: \"98c64805-15f2-4d73-bf1f-dc871c5eb243\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" Apr 16 17:01:40.915354 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:40.915172 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/98c64805-15f2-4d73-bf1f-dc871c5eb243-model-cache\") pod \"stop-feature-test-kserve-5698dc8fcf-zspvz\" (UID: \"98c64805-15f2-4d73-bf1f-dc871c5eb243\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" Apr 16 17:01:40.915354 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:40.915218 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/98c64805-15f2-4d73-bf1f-dc871c5eb243-dshm\") pod \"stop-feature-test-kserve-5698dc8fcf-zspvz\" (UID: \"98c64805-15f2-4d73-bf1f-dc871c5eb243\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" Apr 16 17:01:40.915354 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:40.915245 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/98c64805-15f2-4d73-bf1f-dc871c5eb243-home\") pod \"stop-feature-test-kserve-5698dc8fcf-zspvz\" (UID: \"98c64805-15f2-4d73-bf1f-dc871c5eb243\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" Apr 16 17:01:40.915354 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:40.915291 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/98c64805-15f2-4d73-bf1f-dc871c5eb243-tls-certs\") pod \"stop-feature-test-kserve-5698dc8fcf-zspvz\" (UID: \"98c64805-15f2-4d73-bf1f-dc871c5eb243\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" Apr 16 17:01:40.915354 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:40.915315 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vljl8\" (UniqueName: \"kubernetes.io/projected/98c64805-15f2-4d73-bf1f-dc871c5eb243-kube-api-access-vljl8\") pod \"stop-feature-test-kserve-5698dc8fcf-zspvz\" (UID: \"98c64805-15f2-4d73-bf1f-dc871c5eb243\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" Apr 16 
17:01:40.915664 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:40.915638 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/98c64805-15f2-4d73-bf1f-dc871c5eb243-kserve-provision-location\") pod \"stop-feature-test-kserve-5698dc8fcf-zspvz\" (UID: \"98c64805-15f2-4d73-bf1f-dc871c5eb243\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" Apr 16 17:01:40.915732 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:40.915700 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/98c64805-15f2-4d73-bf1f-dc871c5eb243-model-cache\") pod \"stop-feature-test-kserve-5698dc8fcf-zspvz\" (UID: \"98c64805-15f2-4d73-bf1f-dc871c5eb243\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" Apr 16 17:01:40.916021 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:40.915998 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/98c64805-15f2-4d73-bf1f-dc871c5eb243-home\") pod \"stop-feature-test-kserve-5698dc8fcf-zspvz\" (UID: \"98c64805-15f2-4d73-bf1f-dc871c5eb243\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" Apr 16 17:01:40.917698 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:40.917666 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/98c64805-15f2-4d73-bf1f-dc871c5eb243-dshm\") pod \"stop-feature-test-kserve-5698dc8fcf-zspvz\" (UID: \"98c64805-15f2-4d73-bf1f-dc871c5eb243\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" Apr 16 17:01:40.918004 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:40.917987 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/98c64805-15f2-4d73-bf1f-dc871c5eb243-tls-certs\") pod 
\"stop-feature-test-kserve-5698dc8fcf-zspvz\" (UID: \"98c64805-15f2-4d73-bf1f-dc871c5eb243\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" Apr 16 17:01:40.928363 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:40.928338 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vljl8\" (UniqueName: \"kubernetes.io/projected/98c64805-15f2-4d73-bf1f-dc871c5eb243-kube-api-access-vljl8\") pod \"stop-feature-test-kserve-5698dc8fcf-zspvz\" (UID: \"98c64805-15f2-4d73-bf1f-dc871c5eb243\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" Apr 16 17:01:41.041496 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:41.041341 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" Apr 16 17:01:41.194585 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:41.194559 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz"] Apr 16 17:01:41.196979 ip-10-0-137-126 kubenswrapper[2572]: W0416 17:01:41.196948 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98c64805_15f2_4d73_bf1f_dc871c5eb243.slice/crio-b42aa54d1579628c6ce6164d9d892f0fdf880bc03e599ae4c3dcbc059f0128ff WatchSource:0}: Error finding container b42aa54d1579628c6ce6164d9d892f0fdf880bc03e599ae4c3dcbc059f0128ff: Status 404 returned error can't find the container with id b42aa54d1579628c6ce6164d9d892f0fdf880bc03e599ae4c3dcbc059f0128ff Apr 16 17:01:41.822461 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:41.822426 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" event={"ID":"98c64805-15f2-4d73-bf1f-dc871c5eb243","Type":"ContainerStarted","Data":"58d10f6b0dd411433b0947261a42a3a36a9d7307b00875249fdf854223ab7435"} Apr 16 17:01:41.822461 ip-10-0-137-126 
kubenswrapper[2572]: I0416 17:01:41.822462 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" event={"ID":"98c64805-15f2-4d73-bf1f-dc871c5eb243","Type":"ContainerStarted","Data":"b42aa54d1579628c6ce6164d9d892f0fdf880bc03e599ae4c3dcbc059f0128ff"} Apr 16 17:01:44.031791 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:44.031742 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46"] Apr 16 17:01:44.032375 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:44.032124 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" podUID="d595bc0e-6518-4c7f-94f0-909862c15dd1" containerName="main" containerID="cri-o://4ffeea82de78318820f1a7dde148c389b6fd7645d21aef0976835bd6f24775e2" gracePeriod=30 Apr 16 17:01:45.842629 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:45.842531 2572 generic.go:358] "Generic (PLEG): container finished" podID="98c64805-15f2-4d73-bf1f-dc871c5eb243" containerID="58d10f6b0dd411433b0947261a42a3a36a9d7307b00875249fdf854223ab7435" exitCode=0 Apr 16 17:01:45.842629 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:45.842598 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" event={"ID":"98c64805-15f2-4d73-bf1f-dc871c5eb243","Type":"ContainerDied","Data":"58d10f6b0dd411433b0947261a42a3a36a9d7307b00875249fdf854223ab7435"} Apr 16 17:01:46.849109 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:46.849046 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" event={"ID":"98c64805-15f2-4d73-bf1f-dc871c5eb243","Type":"ContainerStarted","Data":"983e2fea8f646698abbbcd625f075b631c820206e0912b2d837b27b37a28a757"} Apr 16 17:01:46.874877 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:46.874816 2572 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" podStartSLOduration=6.874795534 podStartE2EDuration="6.874795534s" podCreationTimestamp="2026-04-16 17:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:01:46.871468176 +0000 UTC m=+824.040179539" watchObservedRunningTime="2026-04-16 17:01:46.874795534 +0000 UTC m=+824.043506897" Apr 16 17:01:51.042516 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:51.042471 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" Apr 16 17:01:51.042516 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:51.042521 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" Apr 16 17:01:51.044472 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:51.044446 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" podUID="98c64805-15f2-4d73-bf1f-dc871c5eb243" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8000/health\": dial tcp 10.133.0.55:8000: connect: connection refused" Apr 16 17:01:52.296246 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:52.296203 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd"] Apr 16 17:01:52.296698 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:52.296619 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" podUID="d4689eed-e55a-4551-b7ee-12e706d076aa" containerName="main" containerID="cri-o://200a8cce261196d9f28b0a0fd051c9bc5e1ec751ba761ee83292bba58d130747" 
gracePeriod=30 Apr 16 17:01:59.851378 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:59.851342 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb"] Apr 16 17:01:59.854941 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:59.854912 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" Apr 16 17:01:59.857774 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:59.857751 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 16 17:01:59.863301 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:01:59.863278 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb"] Apr 16 17:02:00.011400 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:00.011353 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2e0a5eaa-4271-403b-8817-bb12bea1f93c-dshm\") pod \"custom-route-timeout-test-kserve-968f7c9f5-dcrlb\" (UID: \"2e0a5eaa-4271-403b-8817-bb12bea1f93c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" Apr 16 17:02:00.011400 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:00.011403 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2e0a5eaa-4271-403b-8817-bb12bea1f93c-model-cache\") pod \"custom-route-timeout-test-kserve-968f7c9f5-dcrlb\" (UID: \"2e0a5eaa-4271-403b-8817-bb12bea1f93c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" Apr 16 17:02:00.011642 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:00.011428 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2e0a5eaa-4271-403b-8817-bb12bea1f93c-home\") pod \"custom-route-timeout-test-kserve-968f7c9f5-dcrlb\" (UID: \"2e0a5eaa-4271-403b-8817-bb12bea1f93c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" Apr 16 17:02:00.011642 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:00.011462 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4xcr\" (UniqueName: \"kubernetes.io/projected/2e0a5eaa-4271-403b-8817-bb12bea1f93c-kube-api-access-p4xcr\") pod \"custom-route-timeout-test-kserve-968f7c9f5-dcrlb\" (UID: \"2e0a5eaa-4271-403b-8817-bb12bea1f93c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" Apr 16 17:02:00.011642 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:00.011613 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2e0a5eaa-4271-403b-8817-bb12bea1f93c-tls-certs\") pod \"custom-route-timeout-test-kserve-968f7c9f5-dcrlb\" (UID: \"2e0a5eaa-4271-403b-8817-bb12bea1f93c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" Apr 16 17:02:00.011804 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:00.011646 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e0a5eaa-4271-403b-8817-bb12bea1f93c-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-968f7c9f5-dcrlb\" (UID: \"2e0a5eaa-4271-403b-8817-bb12bea1f93c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" Apr 16 17:02:00.112846 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:00.112760 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2e0a5eaa-4271-403b-8817-bb12bea1f93c-tls-certs\") pod \"custom-route-timeout-test-kserve-968f7c9f5-dcrlb\" (UID: \"2e0a5eaa-4271-403b-8817-bb12bea1f93c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" Apr 16 17:02:00.112846 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:00.112803 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e0a5eaa-4271-403b-8817-bb12bea1f93c-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-968f7c9f5-dcrlb\" (UID: \"2e0a5eaa-4271-403b-8817-bb12bea1f93c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" Apr 16 17:02:00.112846 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:00.112830 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2e0a5eaa-4271-403b-8817-bb12bea1f93c-dshm\") pod \"custom-route-timeout-test-kserve-968f7c9f5-dcrlb\" (UID: \"2e0a5eaa-4271-403b-8817-bb12bea1f93c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" Apr 16 17:02:00.113158 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:00.112854 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2e0a5eaa-4271-403b-8817-bb12bea1f93c-model-cache\") pod \"custom-route-timeout-test-kserve-968f7c9f5-dcrlb\" (UID: \"2e0a5eaa-4271-403b-8817-bb12bea1f93c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" Apr 16 17:02:00.113158 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:00.112879 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2e0a5eaa-4271-403b-8817-bb12bea1f93c-home\") pod \"custom-route-timeout-test-kserve-968f7c9f5-dcrlb\" (UID: \"2e0a5eaa-4271-403b-8817-bb12bea1f93c\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" Apr 16 17:02:00.113158 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:00.112900 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4xcr\" (UniqueName: \"kubernetes.io/projected/2e0a5eaa-4271-403b-8817-bb12bea1f93c-kube-api-access-p4xcr\") pod \"custom-route-timeout-test-kserve-968f7c9f5-dcrlb\" (UID: \"2e0a5eaa-4271-403b-8817-bb12bea1f93c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" Apr 16 17:02:00.113311 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:00.113284 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e0a5eaa-4271-403b-8817-bb12bea1f93c-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-968f7c9f5-dcrlb\" (UID: \"2e0a5eaa-4271-403b-8817-bb12bea1f93c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" Apr 16 17:02:00.113510 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:00.113489 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2e0a5eaa-4271-403b-8817-bb12bea1f93c-model-cache\") pod \"custom-route-timeout-test-kserve-968f7c9f5-dcrlb\" (UID: \"2e0a5eaa-4271-403b-8817-bb12bea1f93c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" Apr 16 17:02:00.113630 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:00.113603 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2e0a5eaa-4271-403b-8817-bb12bea1f93c-home\") pod \"custom-route-timeout-test-kserve-968f7c9f5-dcrlb\" (UID: \"2e0a5eaa-4271-403b-8817-bb12bea1f93c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" Apr 16 17:02:00.115569 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:00.115537 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2e0a5eaa-4271-403b-8817-bb12bea1f93c-dshm\") pod \"custom-route-timeout-test-kserve-968f7c9f5-dcrlb\" (UID: \"2e0a5eaa-4271-403b-8817-bb12bea1f93c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" Apr 16 17:02:00.115727 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:00.115707 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2e0a5eaa-4271-403b-8817-bb12bea1f93c-tls-certs\") pod \"custom-route-timeout-test-kserve-968f7c9f5-dcrlb\" (UID: \"2e0a5eaa-4271-403b-8817-bb12bea1f93c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" Apr 16 17:02:00.121651 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:00.121621 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4xcr\" (UniqueName: \"kubernetes.io/projected/2e0a5eaa-4271-403b-8817-bb12bea1f93c-kube-api-access-p4xcr\") pod \"custom-route-timeout-test-kserve-968f7c9f5-dcrlb\" (UID: \"2e0a5eaa-4271-403b-8817-bb12bea1f93c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" Apr 16 17:02:00.172239 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:00.172203 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" Apr 16 17:02:00.324477 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:00.324442 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb"] Apr 16 17:02:00.327271 ip-10-0-137-126 kubenswrapper[2572]: W0416 17:02:00.327219 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e0a5eaa_4271_403b_8817_bb12bea1f93c.slice/crio-b788e0f0ad4b47455add75f92df3c4968c48282dbe766d55e3b9be2b64abdcff WatchSource:0}: Error finding container b788e0f0ad4b47455add75f92df3c4968c48282dbe766d55e3b9be2b64abdcff: Status 404 returned error can't find the container with id b788e0f0ad4b47455add75f92df3c4968c48282dbe766d55e3b9be2b64abdcff Apr 16 17:02:00.910097 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:00.910043 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" event={"ID":"2e0a5eaa-4271-403b-8817-bb12bea1f93c","Type":"ContainerStarted","Data":"3ee32af601795275c35b982d6f23a790be54956dbf50ae7392e95473988961f6"} Apr 16 17:02:00.910533 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:00.910103 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" event={"ID":"2e0a5eaa-4271-403b-8817-bb12bea1f93c","Type":"ContainerStarted","Data":"b788e0f0ad4b47455add75f92df3c4968c48282dbe766d55e3b9be2b64abdcff"} Apr 16 17:02:01.041827 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:01.041780 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" podUID="98c64805-15f2-4d73-bf1f-dc871c5eb243" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8000/health\": dial tcp 10.133.0.55:8000: connect: connection refused" Apr 16 
17:02:04.939375 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:04.939275 2572 generic.go:358] "Generic (PLEG): container finished" podID="2e0a5eaa-4271-403b-8817-bb12bea1f93c" containerID="3ee32af601795275c35b982d6f23a790be54956dbf50ae7392e95473988961f6" exitCode=0 Apr 16 17:02:04.939375 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:04.939347 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" event={"ID":"2e0a5eaa-4271-403b-8817-bb12bea1f93c","Type":"ContainerDied","Data":"3ee32af601795275c35b982d6f23a790be54956dbf50ae7392e95473988961f6"} Apr 16 17:02:05.945088 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:05.945024 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" event={"ID":"2e0a5eaa-4271-403b-8817-bb12bea1f93c","Type":"ContainerStarted","Data":"efb53025fdfb2eb34072908ddc48283ab758a0658271c84eee81ecf2ab27ccf9"} Apr 16 17:02:05.970624 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:05.970571 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" podStartSLOduration=6.970558275 podStartE2EDuration="6.970558275s" podCreationTimestamp="2026-04-16 17:01:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:02:05.967941509 +0000 UTC m=+843.136652871" watchObservedRunningTime="2026-04-16 17:02:05.970558275 +0000 UTC m=+843.139269638" Apr 16 17:02:10.172659 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:10.172617 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" Apr 16 17:02:10.173109 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:10.172675 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" Apr 16 17:02:10.174594 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:10.174559 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" podUID="2e0a5eaa-4271-403b-8817-bb12bea1f93c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused" Apr 16 17:02:11.042590 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:11.042535 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" podUID="98c64805-15f2-4d73-bf1f-dc871c5eb243" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8000/health\": dial tcp 10.133.0.55:8000: connect: connection refused" Apr 16 17:02:14.182593 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:02:14.182558 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e0a5eaa_4271_403b_8817_bb12bea1f93c.slice/crio-3ee32af601795275c35b982d6f23a790be54956dbf50ae7392e95473988961f6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd595bc0e_6518_4c7f_94f0_909862c15dd1.slice/crio-4ffeea82de78318820f1a7dde148c389b6fd7645d21aef0976835bd6f24775e2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e0a5eaa_4271_403b_8817_bb12bea1f93c.slice/crio-conmon-3ee32af601795275c35b982d6f23a790be54956dbf50ae7392e95473988961f6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd595bc0e_6518_4c7f_94f0_909862c15dd1.slice/crio-conmon-4ffeea82de78318820f1a7dde148c389b6fd7645d21aef0976835bd6f24775e2.scope\": 
RecentStats: unable to find data in memory cache]" Apr 16 17:02:14.182593 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:02:14.182568 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e0a5eaa_4271_403b_8817_bb12bea1f93c.slice/crio-conmon-3ee32af601795275c35b982d6f23a790be54956dbf50ae7392e95473988961f6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd595bc0e_6518_4c7f_94f0_909862c15dd1.slice/crio-conmon-4ffeea82de78318820f1a7dde148c389b6fd7645d21aef0976835bd6f24775e2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd595bc0e_6518_4c7f_94f0_909862c15dd1.slice/crio-4ffeea82de78318820f1a7dde148c389b6fd7645d21aef0976835bd6f24775e2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e0a5eaa_4271_403b_8817_bb12bea1f93c.slice/crio-3ee32af601795275c35b982d6f23a790be54956dbf50ae7392e95473988961f6.scope\": RecentStats: unable to find data in memory cache]" Apr 16 17:02:14.318312 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:14.318278 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-round-trip-kserve-fb84c864-t4g46_d595bc0e-6518-4c7f-94f0-909862c15dd1/main/0.log" Apr 16 17:02:14.318683 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:14.318667 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" Apr 16 17:02:14.355371 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:14.355345 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d595bc0e-6518-4c7f-94f0-909862c15dd1-home\") pod \"d595bc0e-6518-4c7f-94f0-909862c15dd1\" (UID: \"d595bc0e-6518-4c7f-94f0-909862c15dd1\") " Apr 16 17:02:14.355523 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:14.355394 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d595bc0e-6518-4c7f-94f0-909862c15dd1-tls-certs\") pod \"d595bc0e-6518-4c7f-94f0-909862c15dd1\" (UID: \"d595bc0e-6518-4c7f-94f0-909862c15dd1\") " Apr 16 17:02:14.355523 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:14.355422 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d595bc0e-6518-4c7f-94f0-909862c15dd1-model-cache\") pod \"d595bc0e-6518-4c7f-94f0-909862c15dd1\" (UID: \"d595bc0e-6518-4c7f-94f0-909862c15dd1\") " Apr 16 17:02:14.355523 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:14.355472 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7rd2\" (UniqueName: \"kubernetes.io/projected/d595bc0e-6518-4c7f-94f0-909862c15dd1-kube-api-access-q7rd2\") pod \"d595bc0e-6518-4c7f-94f0-909862c15dd1\" (UID: \"d595bc0e-6518-4c7f-94f0-909862c15dd1\") " Apr 16 17:02:14.355676 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:14.355525 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d595bc0e-6518-4c7f-94f0-909862c15dd1-dshm\") pod \"d595bc0e-6518-4c7f-94f0-909862c15dd1\" (UID: \"d595bc0e-6518-4c7f-94f0-909862c15dd1\") " Apr 16 17:02:14.355676 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:14.355586 2572 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d595bc0e-6518-4c7f-94f0-909862c15dd1-kserve-provision-location\") pod \"d595bc0e-6518-4c7f-94f0-909862c15dd1\" (UID: \"d595bc0e-6518-4c7f-94f0-909862c15dd1\") " Apr 16 17:02:14.355676 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:14.355644 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d595bc0e-6518-4c7f-94f0-909862c15dd1-model-cache" (OuterVolumeSpecName: "model-cache") pod "d595bc0e-6518-4c7f-94f0-909862c15dd1" (UID: "d595bc0e-6518-4c7f-94f0-909862c15dd1"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:02:14.355824 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:14.355705 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d595bc0e-6518-4c7f-94f0-909862c15dd1-home" (OuterVolumeSpecName: "home") pod "d595bc0e-6518-4c7f-94f0-909862c15dd1" (UID: "d595bc0e-6518-4c7f-94f0-909862c15dd1"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:02:14.355877 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:14.355847 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d595bc0e-6518-4c7f-94f0-909862c15dd1-home\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:02:14.355877 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:14.355865 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d595bc0e-6518-4c7f-94f0-909862c15dd1-model-cache\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:02:14.358139 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:14.358114 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d595bc0e-6518-4c7f-94f0-909862c15dd1-dshm" (OuterVolumeSpecName: "dshm") pod "d595bc0e-6518-4c7f-94f0-909862c15dd1" (UID: "d595bc0e-6518-4c7f-94f0-909862c15dd1"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:02:14.358239 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:14.358157 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d595bc0e-6518-4c7f-94f0-909862c15dd1-kube-api-access-q7rd2" (OuterVolumeSpecName: "kube-api-access-q7rd2") pod "d595bc0e-6518-4c7f-94f0-909862c15dd1" (UID: "d595bc0e-6518-4c7f-94f0-909862c15dd1"). InnerVolumeSpecName "kube-api-access-q7rd2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:02:14.358444 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:14.358425 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d595bc0e-6518-4c7f-94f0-909862c15dd1-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d595bc0e-6518-4c7f-94f0-909862c15dd1" (UID: "d595bc0e-6518-4c7f-94f0-909862c15dd1"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:02:14.378978 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:14.378944 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d595bc0e-6518-4c7f-94f0-909862c15dd1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d595bc0e-6518-4c7f-94f0-909862c15dd1" (UID: "d595bc0e-6518-4c7f-94f0-909862c15dd1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:02:14.456887 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:14.456855 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d595bc0e-6518-4c7f-94f0-909862c15dd1-kserve-provision-location\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:02:14.456887 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:14.456885 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d595bc0e-6518-4c7f-94f0-909862c15dd1-tls-certs\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:02:14.457048 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:14.456899 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q7rd2\" (UniqueName: \"kubernetes.io/projected/d595bc0e-6518-4c7f-94f0-909862c15dd1-kube-api-access-q7rd2\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:02:14.457048 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:14.456911 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d595bc0e-6518-4c7f-94f0-909862c15dd1-dshm\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:02:14.998821 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:14.998793 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_conv-test-round-trip-kserve-fb84c864-t4g46_d595bc0e-6518-4c7f-94f0-909862c15dd1/main/0.log" Apr 16 17:02:14.999196 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:14.999166 2572 generic.go:358] "Generic (PLEG): container finished" podID="d595bc0e-6518-4c7f-94f0-909862c15dd1" containerID="4ffeea82de78318820f1a7dde148c389b6fd7645d21aef0976835bd6f24775e2" exitCode=137 Apr 16 17:02:14.999278 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:14.999230 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" Apr 16 17:02:14.999278 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:14.999248 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" event={"ID":"d595bc0e-6518-4c7f-94f0-909862c15dd1","Type":"ContainerDied","Data":"4ffeea82de78318820f1a7dde148c389b6fd7645d21aef0976835bd6f24775e2"} Apr 16 17:02:14.999364 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:14.999292 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46" event={"ID":"d595bc0e-6518-4c7f-94f0-909862c15dd1","Type":"ContainerDied","Data":"3a394cbe0b501987e5d42256d9cdcfee27e5ef1fd70a7f087397cea29b8b6e9a"} Apr 16 17:02:14.999364 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:14.999314 2572 scope.go:117] "RemoveContainer" containerID="4ffeea82de78318820f1a7dde148c389b6fd7645d21aef0976835bd6f24775e2" Apr 16 17:02:15.008104 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:15.008054 2572 scope.go:117] "RemoveContainer" containerID="055444c47b1ac27474a82bcdd06e372e4ceb42489bd4e4a65acf8fb1c68e153f" Apr 16 17:02:15.018612 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:15.018588 2572 scope.go:117] "RemoveContainer" containerID="4ffeea82de78318820f1a7dde148c389b6fd7645d21aef0976835bd6f24775e2" Apr 16 17:02:15.019350 ip-10-0-137-126 
kubenswrapper[2572]: E0416 17:02:15.018959 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ffeea82de78318820f1a7dde148c389b6fd7645d21aef0976835bd6f24775e2\": container with ID starting with 4ffeea82de78318820f1a7dde148c389b6fd7645d21aef0976835bd6f24775e2 not found: ID does not exist" containerID="4ffeea82de78318820f1a7dde148c389b6fd7645d21aef0976835bd6f24775e2" Apr 16 17:02:15.019350 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:15.018997 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ffeea82de78318820f1a7dde148c389b6fd7645d21aef0976835bd6f24775e2"} err="failed to get container status \"4ffeea82de78318820f1a7dde148c389b6fd7645d21aef0976835bd6f24775e2\": rpc error: code = NotFound desc = could not find container \"4ffeea82de78318820f1a7dde148c389b6fd7645d21aef0976835bd6f24775e2\": container with ID starting with 4ffeea82de78318820f1a7dde148c389b6fd7645d21aef0976835bd6f24775e2 not found: ID does not exist" Apr 16 17:02:15.019350 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:15.019022 2572 scope.go:117] "RemoveContainer" containerID="055444c47b1ac27474a82bcdd06e372e4ceb42489bd4e4a65acf8fb1c68e153f" Apr 16 17:02:15.019578 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:02:15.019413 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"055444c47b1ac27474a82bcdd06e372e4ceb42489bd4e4a65acf8fb1c68e153f\": container with ID starting with 055444c47b1ac27474a82bcdd06e372e4ceb42489bd4e4a65acf8fb1c68e153f not found: ID does not exist" containerID="055444c47b1ac27474a82bcdd06e372e4ceb42489bd4e4a65acf8fb1c68e153f" Apr 16 17:02:15.019578 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:15.019441 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"055444c47b1ac27474a82bcdd06e372e4ceb42489bd4e4a65acf8fb1c68e153f"} 
err="failed to get container status \"055444c47b1ac27474a82bcdd06e372e4ceb42489bd4e4a65acf8fb1c68e153f\": rpc error: code = NotFound desc = could not find container \"055444c47b1ac27474a82bcdd06e372e4ceb42489bd4e4a65acf8fb1c68e153f\": container with ID starting with 055444c47b1ac27474a82bcdd06e372e4ceb42489bd4e4a65acf8fb1c68e153f not found: ID does not exist" Apr 16 17:02:15.023325 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:15.023272 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46"] Apr 16 17:02:15.025920 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:15.025895 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-fb84c864-t4g46"] Apr 16 17:02:15.598498 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:15.598455 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d595bc0e-6518-4c7f-94f0-909862c15dd1" path="/var/lib/kubelet/pods/d595bc0e-6518-4c7f-94f0-909862c15dd1/volumes" Apr 16 17:02:20.173306 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:20.173269 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" podUID="2e0a5eaa-4271-403b-8817-bb12bea1f93c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused" Apr 16 17:02:21.042622 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:21.042573 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" podUID="98c64805-15f2-4d73-bf1f-dc871c5eb243" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8000/health\": dial tcp 10.133.0.55:8000: connect: connection refused" Apr 16 17:02:22.552909 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:22.552842 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd_d4689eed-e55a-4551-b7ee-12e706d076aa/main/0.log" Apr 16 17:02:22.553334 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:22.553236 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" Apr 16 17:02:22.634024 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:22.633992 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d4689eed-e55a-4551-b7ee-12e706d076aa-dshm\") pod \"d4689eed-e55a-4551-b7ee-12e706d076aa\" (UID: \"d4689eed-e55a-4551-b7ee-12e706d076aa\") " Apr 16 17:02:22.634234 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:22.634047 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d4689eed-e55a-4551-b7ee-12e706d076aa-home\") pod \"d4689eed-e55a-4551-b7ee-12e706d076aa\" (UID: \"d4689eed-e55a-4551-b7ee-12e706d076aa\") " Apr 16 17:02:22.634234 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:22.634089 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d4689eed-e55a-4551-b7ee-12e706d076aa-tls-certs\") pod \"d4689eed-e55a-4551-b7ee-12e706d076aa\" (UID: \"d4689eed-e55a-4551-b7ee-12e706d076aa\") " Apr 16 17:02:22.634234 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:22.634110 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d4689eed-e55a-4551-b7ee-12e706d076aa-model-cache\") pod \"d4689eed-e55a-4551-b7ee-12e706d076aa\" (UID: \"d4689eed-e55a-4551-b7ee-12e706d076aa\") " Apr 16 17:02:22.634234 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:22.634136 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-llckq\" (UniqueName: \"kubernetes.io/projected/d4689eed-e55a-4551-b7ee-12e706d076aa-kube-api-access-llckq\") pod \"d4689eed-e55a-4551-b7ee-12e706d076aa\" (UID: \"d4689eed-e55a-4551-b7ee-12e706d076aa\") " Apr 16 17:02:22.634234 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:22.634174 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d4689eed-e55a-4551-b7ee-12e706d076aa-kserve-provision-location\") pod \"d4689eed-e55a-4551-b7ee-12e706d076aa\" (UID: \"d4689eed-e55a-4551-b7ee-12e706d076aa\") " Apr 16 17:02:22.634513 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:22.634436 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4689eed-e55a-4551-b7ee-12e706d076aa-model-cache" (OuterVolumeSpecName: "model-cache") pod "d4689eed-e55a-4551-b7ee-12e706d076aa" (UID: "d4689eed-e55a-4551-b7ee-12e706d076aa"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:02:22.634513 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:22.634477 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4689eed-e55a-4551-b7ee-12e706d076aa-home" (OuterVolumeSpecName: "home") pod "d4689eed-e55a-4551-b7ee-12e706d076aa" (UID: "d4689eed-e55a-4551-b7ee-12e706d076aa"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:02:22.636531 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:22.636491 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4689eed-e55a-4551-b7ee-12e706d076aa-dshm" (OuterVolumeSpecName: "dshm") pod "d4689eed-e55a-4551-b7ee-12e706d076aa" (UID: "d4689eed-e55a-4551-b7ee-12e706d076aa"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:02:22.636826 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:22.636806 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4689eed-e55a-4551-b7ee-12e706d076aa-kube-api-access-llckq" (OuterVolumeSpecName: "kube-api-access-llckq") pod "d4689eed-e55a-4551-b7ee-12e706d076aa" (UID: "d4689eed-e55a-4551-b7ee-12e706d076aa"). InnerVolumeSpecName "kube-api-access-llckq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:02:22.636925 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:22.636911 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4689eed-e55a-4551-b7ee-12e706d076aa-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d4689eed-e55a-4551-b7ee-12e706d076aa" (UID: "d4689eed-e55a-4551-b7ee-12e706d076aa"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:02:22.646306 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:22.646268 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4689eed-e55a-4551-b7ee-12e706d076aa-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d4689eed-e55a-4551-b7ee-12e706d076aa" (UID: "d4689eed-e55a-4551-b7ee-12e706d076aa"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:02:22.735203 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:22.735160 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d4689eed-e55a-4551-b7ee-12e706d076aa-home\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:02:22.735203 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:22.735188 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d4689eed-e55a-4551-b7ee-12e706d076aa-tls-certs\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:02:22.735203 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:22.735199 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d4689eed-e55a-4551-b7ee-12e706d076aa-model-cache\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:02:22.735203 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:22.735208 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-llckq\" (UniqueName: \"kubernetes.io/projected/d4689eed-e55a-4551-b7ee-12e706d076aa-kube-api-access-llckq\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:02:22.735561 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:22.735218 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d4689eed-e55a-4551-b7ee-12e706d076aa-kserve-provision-location\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:02:22.735561 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:22.735227 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d4689eed-e55a-4551-b7ee-12e706d076aa-dshm\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:02:23.043196 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:23.043164 2572 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd_d4689eed-e55a-4551-b7ee-12e706d076aa/main/0.log" Apr 16 17:02:23.043575 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:23.043538 2572 generic.go:358] "Generic (PLEG): container finished" podID="d4689eed-e55a-4551-b7ee-12e706d076aa" containerID="200a8cce261196d9f28b0a0fd051c9bc5e1ec751ba761ee83292bba58d130747" exitCode=137 Apr 16 17:02:23.043713 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:23.043573 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" event={"ID":"d4689eed-e55a-4551-b7ee-12e706d076aa","Type":"ContainerDied","Data":"200a8cce261196d9f28b0a0fd051c9bc5e1ec751ba761ee83292bba58d130747"} Apr 16 17:02:23.043713 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:23.043611 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" event={"ID":"d4689eed-e55a-4551-b7ee-12e706d076aa","Type":"ContainerDied","Data":"4e1368cf281f067a1ff3e2c3213adb3d54813f0461855e9f5b847608bec914b1"} Apr 16 17:02:23.043713 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:23.043625 2572 scope.go:117] "RemoveContainer" containerID="200a8cce261196d9f28b0a0fd051c9bc5e1ec751ba761ee83292bba58d130747" Apr 16 17:02:23.043713 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:23.043639 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd" Apr 16 17:02:23.068993 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:23.068956 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd"] Apr 16 17:02:23.069492 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:23.069456 2572 scope.go:117] "RemoveContainer" containerID="2f8d02e15a6af09f920bba507fa1d6436ad9b857bc9bdd85e5d464ce786dd447" Apr 16 17:02:23.072776 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:23.072750 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-67b5b4c5dfxsdgd"] Apr 16 17:02:23.085710 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:23.085687 2572 scope.go:117] "RemoveContainer" containerID="200a8cce261196d9f28b0a0fd051c9bc5e1ec751ba761ee83292bba58d130747" Apr 16 17:02:23.085989 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:02:23.085969 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"200a8cce261196d9f28b0a0fd051c9bc5e1ec751ba761ee83292bba58d130747\": container with ID starting with 200a8cce261196d9f28b0a0fd051c9bc5e1ec751ba761ee83292bba58d130747 not found: ID does not exist" containerID="200a8cce261196d9f28b0a0fd051c9bc5e1ec751ba761ee83292bba58d130747" Apr 16 17:02:23.086149 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:23.085997 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"200a8cce261196d9f28b0a0fd051c9bc5e1ec751ba761ee83292bba58d130747"} err="failed to get container status \"200a8cce261196d9f28b0a0fd051c9bc5e1ec751ba761ee83292bba58d130747\": rpc error: code = NotFound desc = could not find container \"200a8cce261196d9f28b0a0fd051c9bc5e1ec751ba761ee83292bba58d130747\": container with ID starting with 
200a8cce261196d9f28b0a0fd051c9bc5e1ec751ba761ee83292bba58d130747 not found: ID does not exist" Apr 16 17:02:23.086149 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:23.086017 2572 scope.go:117] "RemoveContainer" containerID="2f8d02e15a6af09f920bba507fa1d6436ad9b857bc9bdd85e5d464ce786dd447" Apr 16 17:02:23.086307 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:02:23.086276 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f8d02e15a6af09f920bba507fa1d6436ad9b857bc9bdd85e5d464ce786dd447\": container with ID starting with 2f8d02e15a6af09f920bba507fa1d6436ad9b857bc9bdd85e5d464ce786dd447 not found: ID does not exist" containerID="2f8d02e15a6af09f920bba507fa1d6436ad9b857bc9bdd85e5d464ce786dd447" Apr 16 17:02:23.086370 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:23.086313 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f8d02e15a6af09f920bba507fa1d6436ad9b857bc9bdd85e5d464ce786dd447"} err="failed to get container status \"2f8d02e15a6af09f920bba507fa1d6436ad9b857bc9bdd85e5d464ce786dd447\": rpc error: code = NotFound desc = could not find container \"2f8d02e15a6af09f920bba507fa1d6436ad9b857bc9bdd85e5d464ce786dd447\": container with ID starting with 2f8d02e15a6af09f920bba507fa1d6436ad9b857bc9bdd85e5d464ce786dd447 not found: ID does not exist" Apr 16 17:02:23.592836 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:23.592795 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4689eed-e55a-4551-b7ee-12e706d076aa" path="/var/lib/kubelet/pods/d4689eed-e55a-4551-b7ee-12e706d076aa/volumes" Apr 16 17:02:30.173569 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:30.173518 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" podUID="2e0a5eaa-4271-403b-8817-bb12bea1f93c" containerName="main" probeResult="failure" output="Get 
\"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused" Apr 16 17:02:31.042773 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:31.042719 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" podUID="98c64805-15f2-4d73-bf1f-dc871c5eb243" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8000/health\": dial tcp 10.133.0.55:8000: connect: connection refused" Apr 16 17:02:40.173601 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:40.173548 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" podUID="2e0a5eaa-4271-403b-8817-bb12bea1f93c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused" Apr 16 17:02:41.042388 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:41.042340 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" podUID="98c64805-15f2-4d73-bf1f-dc871c5eb243" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8000/health\": dial tcp 10.133.0.55:8000: connect: connection refused" Apr 16 17:02:50.172760 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:50.172677 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" podUID="2e0a5eaa-4271-403b-8817-bb12bea1f93c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused" Apr 16 17:02:51.042366 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:02:51.042321 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" podUID="98c64805-15f2-4d73-bf1f-dc871c5eb243" containerName="main" probeResult="failure" 
output="Get \"https://10.133.0.55:8000/health\": dial tcp 10.133.0.55:8000: connect: connection refused" Apr 16 17:03:00.172800 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:00.172754 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" podUID="2e0a5eaa-4271-403b-8817-bb12bea1f93c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused" Apr 16 17:03:01.042333 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:01.042274 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" podUID="98c64805-15f2-4d73-bf1f-dc871c5eb243" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8000/health\": dial tcp 10.133.0.55:8000: connect: connection refused" Apr 16 17:03:03.537617 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:03.537587 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brhp4_c0c5c0a0-29b2-4743-af7a-0c1150829a60/ovn-acl-logging/0.log" Apr 16 17:03:03.538586 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:03.538563 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brhp4_c0c5c0a0-29b2-4743-af7a-0c1150829a60/ovn-acl-logging/0.log" Apr 16 17:03:10.173446 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:10.173405 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" podUID="2e0a5eaa-4271-403b-8817-bb12bea1f93c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused" Apr 16 17:03:11.042083 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:11.041968 2572 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" podUID="98c64805-15f2-4d73-bf1f-dc871c5eb243" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8000/health\": dial tcp 10.133.0.55:8000: connect: connection refused" Apr 16 17:03:20.173084 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:20.173023 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" podUID="2e0a5eaa-4271-403b-8817-bb12bea1f93c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused" Apr 16 17:03:21.042752 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:21.042695 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" podUID="98c64805-15f2-4d73-bf1f-dc871c5eb243" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8000/health\": dial tcp 10.133.0.55:8000: connect: connection refused" Apr 16 17:03:30.173036 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:30.172972 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" podUID="2e0a5eaa-4271-403b-8817-bb12bea1f93c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused" Apr 16 17:03:31.042152 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:31.042097 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" podUID="98c64805-15f2-4d73-bf1f-dc871c5eb243" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8000/health\": dial tcp 10.133.0.55:8000: connect: connection refused" Apr 16 17:03:40.172829 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:40.172784 2572 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" podUID="2e0a5eaa-4271-403b-8817-bb12bea1f93c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused" Apr 16 17:03:41.055374 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:41.055339 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" Apr 16 17:03:41.063765 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:41.063743 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" Apr 16 17:03:42.640775 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:42.640741 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz"] Apr 16 17:03:42.641258 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:42.641087 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" podUID="98c64805-15f2-4d73-bf1f-dc871c5eb243" containerName="main" containerID="cri-o://983e2fea8f646698abbbcd625f075b631c820206e0912b2d837b27b37a28a757" gracePeriod=30 Apr 16 17:03:50.172837 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:50.172784 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" podUID="2e0a5eaa-4271-403b-8817-bb12bea1f93c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused" Apr 16 17:03:51.846571 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:51.846536 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf"] Apr 16 17:03:51.846974 ip-10-0-137-126 kubenswrapper[2572]: 
I0416 17:03:51.846912 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d595bc0e-6518-4c7f-94f0-909862c15dd1" containerName="main" Apr 16 17:03:51.846974 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:51.846922 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d595bc0e-6518-4c7f-94f0-909862c15dd1" containerName="main" Apr 16 17:03:51.846974 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:51.846936 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d4689eed-e55a-4551-b7ee-12e706d076aa" containerName="storage-initializer" Apr 16 17:03:51.846974 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:51.846943 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4689eed-e55a-4551-b7ee-12e706d076aa" containerName="storage-initializer" Apr 16 17:03:51.846974 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:51.846949 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d4689eed-e55a-4551-b7ee-12e706d076aa" containerName="main" Apr 16 17:03:51.846974 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:51.846954 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4689eed-e55a-4551-b7ee-12e706d076aa" containerName="main" Apr 16 17:03:51.846974 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:51.846962 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d595bc0e-6518-4c7f-94f0-909862c15dd1" containerName="storage-initializer" Apr 16 17:03:51.846974 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:51.846967 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d595bc0e-6518-4c7f-94f0-909862c15dd1" containerName="storage-initializer" Apr 16 17:03:51.847240 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:51.847029 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="d4689eed-e55a-4551-b7ee-12e706d076aa" containerName="main" Apr 16 17:03:51.847240 ip-10-0-137-126 kubenswrapper[2572]: I0416 
17:03:51.847037 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="d595bc0e-6518-4c7f-94f0-909862c15dd1" containerName="main" Apr 16 17:03:51.850283 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:51.850263 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf" Apr 16 17:03:51.859570 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:51.859546 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf"] Apr 16 17:03:51.909624 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:51.909572 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-tls-certs\") pod \"stop-feature-test-kserve-5698dc8fcf-ng9bf\" (UID: \"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf" Apr 16 17:03:51.909812 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:51.909645 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-home\") pod \"stop-feature-test-kserve-5698dc8fcf-ng9bf\" (UID: \"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf" Apr 16 17:03:51.909812 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:51.909689 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-model-cache\") pod \"stop-feature-test-kserve-5698dc8fcf-ng9bf\" (UID: \"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf" Apr 16 17:03:51.909812 ip-10-0-137-126 kubenswrapper[2572]: I0416 
17:03:51.909723 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-kserve-provision-location\") pod \"stop-feature-test-kserve-5698dc8fcf-ng9bf\" (UID: \"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf"
Apr 16 17:03:51.909997 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:51.909844 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75dck\" (UniqueName: \"kubernetes.io/projected/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-kube-api-access-75dck\") pod \"stop-feature-test-kserve-5698dc8fcf-ng9bf\" (UID: \"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf"
Apr 16 17:03:51.909997 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:51.909886 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-dshm\") pod \"stop-feature-test-kserve-5698dc8fcf-ng9bf\" (UID: \"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf"
Apr 16 17:03:52.011006 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:52.010966 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-tls-certs\") pod \"stop-feature-test-kserve-5698dc8fcf-ng9bf\" (UID: \"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf"
Apr 16 17:03:52.011236 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:52.011026 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-home\") pod \"stop-feature-test-kserve-5698dc8fcf-ng9bf\" (UID: \"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf"
Apr 16 17:03:52.011236 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:52.011103 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-model-cache\") pod \"stop-feature-test-kserve-5698dc8fcf-ng9bf\" (UID: \"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf"
Apr 16 17:03:52.011236 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:52.011143 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-kserve-provision-location\") pod \"stop-feature-test-kserve-5698dc8fcf-ng9bf\" (UID: \"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf"
Apr 16 17:03:52.011424 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:52.011245 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75dck\" (UniqueName: \"kubernetes.io/projected/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-kube-api-access-75dck\") pod \"stop-feature-test-kserve-5698dc8fcf-ng9bf\" (UID: \"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf"
Apr 16 17:03:52.011424 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:52.011289 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-dshm\") pod \"stop-feature-test-kserve-5698dc8fcf-ng9bf\" (UID: \"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf"
Apr 16 17:03:52.011632 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:52.011604 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-home\") pod \"stop-feature-test-kserve-5698dc8fcf-ng9bf\" (UID: \"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf"
Apr 16 17:03:52.012450 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:52.011797 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-kserve-provision-location\") pod \"stop-feature-test-kserve-5698dc8fcf-ng9bf\" (UID: \"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf"
Apr 16 17:03:52.012596 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:52.012390 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-model-cache\") pod \"stop-feature-test-kserve-5698dc8fcf-ng9bf\" (UID: \"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf"
Apr 16 17:03:52.014321 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:52.014282 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-tls-certs\") pod \"stop-feature-test-kserve-5698dc8fcf-ng9bf\" (UID: \"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf"
Apr 16 17:03:52.014530 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:52.014510 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-dshm\") pod \"stop-feature-test-kserve-5698dc8fcf-ng9bf\" (UID: \"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf"
Apr 16 17:03:52.020312 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:52.020290 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75dck\" (UniqueName: \"kubernetes.io/projected/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-kube-api-access-75dck\") pod \"stop-feature-test-kserve-5698dc8fcf-ng9bf\" (UID: \"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf"
Apr 16 17:03:52.161628 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:52.161597 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf"
Apr 16 17:03:52.328406 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:52.328354 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf"]
Apr 16 17:03:52.334229 ip-10-0-137-126 kubenswrapper[2572]: W0416 17:03:52.334202 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f6e7d4b_cd5b_43ba_b1b5_95c01c4763c6.slice/crio-360b8965675e566671edb8186305f819bd88d50b6414c788884d16091465f90e WatchSource:0}: Error finding container 360b8965675e566671edb8186305f819bd88d50b6414c788884d16091465f90e: Status 404 returned error can't find the container with id 360b8965675e566671edb8186305f819bd88d50b6414c788884d16091465f90e
Apr 16 17:03:52.336655 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:52.336634 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 17:03:52.413487 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:52.413423 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf" event={"ID":"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6","Type":"ContainerStarted","Data":"8005457c02fe2f3b90c5cc676f5e6ca7d1f6cb7e5fbd40b033ce49b5c682765c"}
Apr 16 17:03:52.413487 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:52.413464 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf" event={"ID":"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6","Type":"ContainerStarted","Data":"360b8965675e566671edb8186305f819bd88d50b6414c788884d16091465f90e"}
Apr 16 17:03:57.435382 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:57.435347 2572 generic.go:358] "Generic (PLEG): container finished" podID="1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6" containerID="8005457c02fe2f3b90c5cc676f5e6ca7d1f6cb7e5fbd40b033ce49b5c682765c" exitCode=0
Apr 16 17:03:57.435763 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:57.435419 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf" event={"ID":"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6","Type":"ContainerDied","Data":"8005457c02fe2f3b90c5cc676f5e6ca7d1f6cb7e5fbd40b033ce49b5c682765c"}
Apr 16 17:03:58.440939 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:58.440900 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf" event={"ID":"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6","Type":"ContainerStarted","Data":"49685ce5b9dd4ec66f8a734083c700255c41f7d3f929a0aae16b166eb01e051b"}
Apr 16 17:03:58.463238 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:03:58.463190 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf" podStartSLOduration=7.463171616 podStartE2EDuration="7.463171616s" podCreationTimestamp="2026-04-16 17:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:03:58.460284113 +0000 UTC m=+955.628995475" watchObservedRunningTime="2026-04-16 17:03:58.463171616 +0000 UTC m=+955.631883013"
Apr 16 17:04:00.188390 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:00.188359 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb"
Apr 16 17:04:00.200081 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:00.200041 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb"
Apr 16 17:04:02.161966 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:02.161924 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf"
Apr 16 17:04:02.161966 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:02.161968 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf"
Apr 16 17:04:02.163623 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:02.163596 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf" podUID="1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6" containerName="main" probeResult="failure" output="Get \"https://10.133.0.57:8000/health\": dial tcp 10.133.0.57:8000: connect: connection refused"
Apr 16 17:04:07.868715 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:07.868676 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb"]
Apr 16 17:04:07.869200 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:07.869009 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" podUID="2e0a5eaa-4271-403b-8817-bb12bea1f93c" containerName="main" containerID="cri-o://efb53025fdfb2eb34072908ddc48283ab758a0658271c84eee81ecf2ab27ccf9" gracePeriod=30
Apr 16 17:04:12.161953 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:12.161908 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf" podUID="1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6" containerName="main" probeResult="failure" output="Get \"https://10.133.0.57:8000/health\": dial tcp 10.133.0.57:8000: connect: connection refused"
Apr 16 17:04:12.941022 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:12.940996 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-5698dc8fcf-zspvz_98c64805-15f2-4d73-bf1f-dc871c5eb243/main/0.log"
Apr 16 17:04:12.941407 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:12.941390 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz"
Apr 16 17:04:13.105589 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:13.105553 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/98c64805-15f2-4d73-bf1f-dc871c5eb243-dshm\") pod \"98c64805-15f2-4d73-bf1f-dc871c5eb243\" (UID: \"98c64805-15f2-4d73-bf1f-dc871c5eb243\") "
Apr 16 17:04:13.105770 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:13.105623 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/98c64805-15f2-4d73-bf1f-dc871c5eb243-tls-certs\") pod \"98c64805-15f2-4d73-bf1f-dc871c5eb243\" (UID: \"98c64805-15f2-4d73-bf1f-dc871c5eb243\") "
Apr 16 17:04:13.105770 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:13.105663 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/98c64805-15f2-4d73-bf1f-dc871c5eb243-model-cache\") pod \"98c64805-15f2-4d73-bf1f-dc871c5eb243\" (UID: \"98c64805-15f2-4d73-bf1f-dc871c5eb243\") "
Apr 16 17:04:13.105770 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:13.105678 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vljl8\" (UniqueName: \"kubernetes.io/projected/98c64805-15f2-4d73-bf1f-dc871c5eb243-kube-api-access-vljl8\") pod \"98c64805-15f2-4d73-bf1f-dc871c5eb243\" (UID: \"98c64805-15f2-4d73-bf1f-dc871c5eb243\") "
Apr 16 17:04:13.105770 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:13.105707 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/98c64805-15f2-4d73-bf1f-dc871c5eb243-home\") pod \"98c64805-15f2-4d73-bf1f-dc871c5eb243\" (UID: \"98c64805-15f2-4d73-bf1f-dc871c5eb243\") "
Apr 16 17:04:13.105770 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:13.105762 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/98c64805-15f2-4d73-bf1f-dc871c5eb243-kserve-provision-location\") pod \"98c64805-15f2-4d73-bf1f-dc871c5eb243\" (UID: \"98c64805-15f2-4d73-bf1f-dc871c5eb243\") "
Apr 16 17:04:13.106092 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:13.105916 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98c64805-15f2-4d73-bf1f-dc871c5eb243-model-cache" (OuterVolumeSpecName: "model-cache") pod "98c64805-15f2-4d73-bf1f-dc871c5eb243" (UID: "98c64805-15f2-4d73-bf1f-dc871c5eb243"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:04:13.106092 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:13.106025 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/98c64805-15f2-4d73-bf1f-dc871c5eb243-model-cache\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 17:04:13.106260 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:13.106081 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98c64805-15f2-4d73-bf1f-dc871c5eb243-home" (OuterVolumeSpecName: "home") pod "98c64805-15f2-4d73-bf1f-dc871c5eb243" (UID: "98c64805-15f2-4d73-bf1f-dc871c5eb243"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:04:13.107720 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:13.107695 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98c64805-15f2-4d73-bf1f-dc871c5eb243-dshm" (OuterVolumeSpecName: "dshm") pod "98c64805-15f2-4d73-bf1f-dc871c5eb243" (UID: "98c64805-15f2-4d73-bf1f-dc871c5eb243"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:04:13.107847 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:13.107807 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98c64805-15f2-4d73-bf1f-dc871c5eb243-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "98c64805-15f2-4d73-bf1f-dc871c5eb243" (UID: "98c64805-15f2-4d73-bf1f-dc871c5eb243"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 17:04:13.108161 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:13.108146 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98c64805-15f2-4d73-bf1f-dc871c5eb243-kube-api-access-vljl8" (OuterVolumeSpecName: "kube-api-access-vljl8") pod "98c64805-15f2-4d73-bf1f-dc871c5eb243" (UID: "98c64805-15f2-4d73-bf1f-dc871c5eb243"). InnerVolumeSpecName "kube-api-access-vljl8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 17:04:13.144053 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:13.144005 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98c64805-15f2-4d73-bf1f-dc871c5eb243-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "98c64805-15f2-4d73-bf1f-dc871c5eb243" (UID: "98c64805-15f2-4d73-bf1f-dc871c5eb243"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:04:13.207303 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:13.207220 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/98c64805-15f2-4d73-bf1f-dc871c5eb243-kserve-provision-location\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 17:04:13.207303 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:13.207258 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/98c64805-15f2-4d73-bf1f-dc871c5eb243-dshm\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 17:04:13.207303 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:13.207274 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/98c64805-15f2-4d73-bf1f-dc871c5eb243-tls-certs\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 17:04:13.207303 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:13.207291 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vljl8\" (UniqueName: \"kubernetes.io/projected/98c64805-15f2-4d73-bf1f-dc871c5eb243-kube-api-access-vljl8\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 17:04:13.207742 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:13.207305 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/98c64805-15f2-4d73-bf1f-dc871c5eb243-home\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 17:04:13.509589 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:13.509508 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-5698dc8fcf-zspvz_98c64805-15f2-4d73-bf1f-dc871c5eb243/main/0.log"
Apr 16 17:04:13.509875 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:13.509849 2572 generic.go:358] "Generic (PLEG): container finished" podID="98c64805-15f2-4d73-bf1f-dc871c5eb243" containerID="983e2fea8f646698abbbcd625f075b631c820206e0912b2d837b27b37a28a757" exitCode=137
Apr 16 17:04:13.509940 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:13.509918 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz"
Apr 16 17:04:13.509977 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:13.509941 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" event={"ID":"98c64805-15f2-4d73-bf1f-dc871c5eb243","Type":"ContainerDied","Data":"983e2fea8f646698abbbcd625f075b631c820206e0912b2d837b27b37a28a757"}
Apr 16 17:04:13.509977 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:13.509971 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz" event={"ID":"98c64805-15f2-4d73-bf1f-dc871c5eb243","Type":"ContainerDied","Data":"b42aa54d1579628c6ce6164d9d892f0fdf880bc03e599ae4c3dcbc059f0128ff"}
Apr 16 17:04:13.510040 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:13.509987 2572 scope.go:117] "RemoveContainer" containerID="983e2fea8f646698abbbcd625f075b631c820206e0912b2d837b27b37a28a757"
Apr 16 17:04:13.531327 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:13.531301 2572 scope.go:117] "RemoveContainer" containerID="58d10f6b0dd411433b0947261a42a3a36a9d7307b00875249fdf854223ab7435"
Apr 16 17:04:13.533934 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:13.533909 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz"]
Apr 16 17:04:13.537495 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:13.537475 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-zspvz"]
Apr 16 17:04:13.592482 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:13.592442 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98c64805-15f2-4d73-bf1f-dc871c5eb243" path="/var/lib/kubelet/pods/98c64805-15f2-4d73-bf1f-dc871c5eb243/volumes"
Apr 16 17:04:13.593076 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:13.593047 2572 scope.go:117] "RemoveContainer" containerID="983e2fea8f646698abbbcd625f075b631c820206e0912b2d837b27b37a28a757"
Apr 16 17:04:13.593337 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:04:13.593316 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"983e2fea8f646698abbbcd625f075b631c820206e0912b2d837b27b37a28a757\": container with ID starting with 983e2fea8f646698abbbcd625f075b631c820206e0912b2d837b27b37a28a757 not found: ID does not exist" containerID="983e2fea8f646698abbbcd625f075b631c820206e0912b2d837b27b37a28a757"
Apr 16 17:04:13.593412 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:13.593350 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"983e2fea8f646698abbbcd625f075b631c820206e0912b2d837b27b37a28a757"} err="failed to get container status \"983e2fea8f646698abbbcd625f075b631c820206e0912b2d837b27b37a28a757\": rpc error: code = NotFound desc = could not find container \"983e2fea8f646698abbbcd625f075b631c820206e0912b2d837b27b37a28a757\": container with ID starting with 983e2fea8f646698abbbcd625f075b631c820206e0912b2d837b27b37a28a757 not found: ID does not exist"
Apr 16 17:04:13.593412 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:13.593374 2572 scope.go:117] "RemoveContainer" containerID="58d10f6b0dd411433b0947261a42a3a36a9d7307b00875249fdf854223ab7435"
Apr 16 17:04:13.593663 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:04:13.593639 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58d10f6b0dd411433b0947261a42a3a36a9d7307b00875249fdf854223ab7435\": container with ID starting with 58d10f6b0dd411433b0947261a42a3a36a9d7307b00875249fdf854223ab7435 not found: ID does not exist" containerID="58d10f6b0dd411433b0947261a42a3a36a9d7307b00875249fdf854223ab7435"
Apr 16 17:04:13.593713 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:13.593674 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58d10f6b0dd411433b0947261a42a3a36a9d7307b00875249fdf854223ab7435"} err="failed to get container status \"58d10f6b0dd411433b0947261a42a3a36a9d7307b00875249fdf854223ab7435\": rpc error: code = NotFound desc = could not find container \"58d10f6b0dd411433b0947261a42a3a36a9d7307b00875249fdf854223ab7435\": container with ID starting with 58d10f6b0dd411433b0947261a42a3a36a9d7307b00875249fdf854223ab7435 not found: ID does not exist"
Apr 16 17:04:17.923772 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:17.923639 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx"]
Apr 16 17:04:17.924266 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:17.924246 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="98c64805-15f2-4d73-bf1f-dc871c5eb243" containerName="main"
Apr 16 17:04:17.924331 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:17.924271 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="98c64805-15f2-4d73-bf1f-dc871c5eb243" containerName="main"
Apr 16 17:04:17.924331 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:17.924290 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="98c64805-15f2-4d73-bf1f-dc871c5eb243" containerName="storage-initializer"
Apr 16 17:04:17.924331 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:17.924298 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="98c64805-15f2-4d73-bf1f-dc871c5eb243" containerName="storage-initializer"
Apr 16 17:04:17.924460 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:17.924388 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="98c64805-15f2-4d73-bf1f-dc871c5eb243" containerName="main"
Apr 16 17:04:17.930707 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:17.930671 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx"
Apr 16 17:04:17.934026 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:17.934001 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\""
Apr 16 17:04:17.938426 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:17.938394 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx"]
Apr 16 17:04:18.047798 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:18.047746 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2a52f9ca-448f-4e24-814a-6a163ca3e526-model-cache\") pod \"router-with-refs-test-kserve-5959bdbb55-8fxgx\" (UID: \"2a52f9ca-448f-4e24-814a-6a163ca3e526\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx"
Apr 16 17:04:18.048000 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:18.047861 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2a52f9ca-448f-4e24-814a-6a163ca3e526-dshm\") pod \"router-with-refs-test-kserve-5959bdbb55-8fxgx\" (UID: \"2a52f9ca-448f-4e24-814a-6a163ca3e526\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx"
Apr 16 17:04:18.048000 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:18.047887 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gm6t\" (UniqueName: \"kubernetes.io/projected/2a52f9ca-448f-4e24-814a-6a163ca3e526-kube-api-access-6gm6t\") pod \"router-with-refs-test-kserve-5959bdbb55-8fxgx\" (UID: \"2a52f9ca-448f-4e24-814a-6a163ca3e526\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx"
Apr 16 17:04:18.048000 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:18.047947 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2a52f9ca-448f-4e24-814a-6a163ca3e526-kserve-provision-location\") pod \"router-with-refs-test-kserve-5959bdbb55-8fxgx\" (UID: \"2a52f9ca-448f-4e24-814a-6a163ca3e526\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx"
Apr 16 17:04:18.048000 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:18.047976 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2a52f9ca-448f-4e24-814a-6a163ca3e526-home\") pod \"router-with-refs-test-kserve-5959bdbb55-8fxgx\" (UID: \"2a52f9ca-448f-4e24-814a-6a163ca3e526\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx"
Apr 16 17:04:18.048000 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:18.047993 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2a52f9ca-448f-4e24-814a-6a163ca3e526-tls-certs\") pod \"router-with-refs-test-kserve-5959bdbb55-8fxgx\" (UID: \"2a52f9ca-448f-4e24-814a-6a163ca3e526\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx"
Apr 16 17:04:18.148958 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:18.148919 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2a52f9ca-448f-4e24-814a-6a163ca3e526-kserve-provision-location\") pod \"router-with-refs-test-kserve-5959bdbb55-8fxgx\" (UID: \"2a52f9ca-448f-4e24-814a-6a163ca3e526\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx"
Apr 16 17:04:18.149141 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:18.148971 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2a52f9ca-448f-4e24-814a-6a163ca3e526-home\") pod \"router-with-refs-test-kserve-5959bdbb55-8fxgx\" (UID: \"2a52f9ca-448f-4e24-814a-6a163ca3e526\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx"
Apr 16 17:04:18.149190 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:18.149144 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2a52f9ca-448f-4e24-814a-6a163ca3e526-tls-certs\") pod \"router-with-refs-test-kserve-5959bdbb55-8fxgx\" (UID: \"2a52f9ca-448f-4e24-814a-6a163ca3e526\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx"
Apr 16 17:04:18.149262 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:18.149246 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2a52f9ca-448f-4e24-814a-6a163ca3e526-model-cache\") pod \"router-with-refs-test-kserve-5959bdbb55-8fxgx\" (UID: \"2a52f9ca-448f-4e24-814a-6a163ca3e526\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx"
Apr 16 17:04:18.149339 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:18.149323 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2a52f9ca-448f-4e24-814a-6a163ca3e526-dshm\") pod \"router-with-refs-test-kserve-5959bdbb55-8fxgx\" (UID: \"2a52f9ca-448f-4e24-814a-6a163ca3e526\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx"
Apr 16 17:04:18.149414 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:18.149355 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6gm6t\" (UniqueName: \"kubernetes.io/projected/2a52f9ca-448f-4e24-814a-6a163ca3e526-kube-api-access-6gm6t\") pod \"router-with-refs-test-kserve-5959bdbb55-8fxgx\" (UID: \"2a52f9ca-448f-4e24-814a-6a163ca3e526\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx"
Apr 16 17:04:18.149414 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:18.149364 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2a52f9ca-448f-4e24-814a-6a163ca3e526-kserve-provision-location\") pod \"router-with-refs-test-kserve-5959bdbb55-8fxgx\" (UID: \"2a52f9ca-448f-4e24-814a-6a163ca3e526\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx"
Apr 16 17:04:18.149414 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:18.149402 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2a52f9ca-448f-4e24-814a-6a163ca3e526-home\") pod \"router-with-refs-test-kserve-5959bdbb55-8fxgx\" (UID: \"2a52f9ca-448f-4e24-814a-6a163ca3e526\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx"
Apr 16 17:04:18.149596 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:18.149575 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2a52f9ca-448f-4e24-814a-6a163ca3e526-model-cache\") pod \"router-with-refs-test-kserve-5959bdbb55-8fxgx\" (UID: \"2a52f9ca-448f-4e24-814a-6a163ca3e526\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx"
Apr 16 17:04:18.151475 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:18.151453 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2a52f9ca-448f-4e24-814a-6a163ca3e526-dshm\") pod \"router-with-refs-test-kserve-5959bdbb55-8fxgx\" (UID: \"2a52f9ca-448f-4e24-814a-6a163ca3e526\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx"
Apr 16 17:04:18.152586 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:18.152563 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2a52f9ca-448f-4e24-814a-6a163ca3e526-tls-certs\") pod \"router-with-refs-test-kserve-5959bdbb55-8fxgx\" (UID: \"2a52f9ca-448f-4e24-814a-6a163ca3e526\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx"
Apr 16 17:04:18.157031 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:18.157007 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gm6t\" (UniqueName: \"kubernetes.io/projected/2a52f9ca-448f-4e24-814a-6a163ca3e526-kube-api-access-6gm6t\") pod \"router-with-refs-test-kserve-5959bdbb55-8fxgx\" (UID: \"2a52f9ca-448f-4e24-814a-6a163ca3e526\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx"
Apr 16 17:04:18.243469 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:18.243373 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx"
Apr 16 17:04:18.415704 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:18.415673 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx"]
Apr 16 17:04:18.416757 ip-10-0-137-126 kubenswrapper[2572]: W0416 17:04:18.416709 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a52f9ca_448f_4e24_814a_6a163ca3e526.slice/crio-607c10a5ac348450ba7dde7e16bf78a133878370a5b998273326aab1e99b68e5 WatchSource:0}: Error finding container 607c10a5ac348450ba7dde7e16bf78a133878370a5b998273326aab1e99b68e5: Status 404 returned error can't find the container with id 607c10a5ac348450ba7dde7e16bf78a133878370a5b998273326aab1e99b68e5
Apr 16 17:04:18.533280 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:18.533185 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx" event={"ID":"2a52f9ca-448f-4e24-814a-6a163ca3e526","Type":"ContainerStarted","Data":"7bfc0efb2d135606427facc6fc44fb4b7a29800d6573881a1b418264e031793b"}
Apr 16 17:04:18.533280 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:18.533236 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx" event={"ID":"2a52f9ca-448f-4e24-814a-6a163ca3e526","Type":"ContainerStarted","Data":"607c10a5ac348450ba7dde7e16bf78a133878370a5b998273326aab1e99b68e5"}
Apr 16 17:04:22.162558 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:22.162510 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf" podUID="1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6" containerName="main" probeResult="failure" output="Get \"https://10.133.0.57:8000/health\": dial tcp 10.133.0.57:8000: connect: connection refused"
Apr 16 17:04:22.553936 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:22.553903 2572 generic.go:358] "Generic (PLEG): container finished" podID="2a52f9ca-448f-4e24-814a-6a163ca3e526" containerID="7bfc0efb2d135606427facc6fc44fb4b7a29800d6573881a1b418264e031793b" exitCode=0
Apr 16 17:04:22.554054 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:22.553979 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx" event={"ID":"2a52f9ca-448f-4e24-814a-6a163ca3e526","Type":"ContainerDied","Data":"7bfc0efb2d135606427facc6fc44fb4b7a29800d6573881a1b418264e031793b"}
Apr 16 17:04:23.560249 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:23.560213 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx" event={"ID":"2a52f9ca-448f-4e24-814a-6a163ca3e526","Type":"ContainerStarted","Data":"039901f35a7e6ee7aad462137c8deb034a67a6b24ebfb74ccd9a4264685e557b"}
Apr 16 17:04:23.583269 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:23.583218 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx" podStartSLOduration=6.583199792 podStartE2EDuration="6.583199792s" podCreationTimestamp="2026-04-16 17:04:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:04:23.581037688 +0000 UTC m=+980.749749051" watchObservedRunningTime="2026-04-16 17:04:23.583199792 +0000 UTC m=+980.751911155"
Apr 16 17:04:28.244375 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:28.244339 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx"
Apr 16 17:04:28.244375 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:28.244380 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx"
Apr 16 17:04:28.246168 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:28.246140 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx" podUID="2a52f9ca-448f-4e24-814a-6a163ca3e526" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused"
Apr 16 17:04:32.162533 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:32.162492 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf" podUID="1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6" containerName="main" probeResult="failure" output="Get \"https://10.133.0.57:8000/health\": dial tcp 10.133.0.57:8000: connect: connection refused"
Apr 16 17:04:38.226934 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.226908 2572 log.go:25] "Finished parsing log file"
path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-968f7c9f5-dcrlb_2e0a5eaa-4271-403b-8817-bb12bea1f93c/main/0.log" Apr 16 17:04:38.227395 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.227377 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" Apr 16 17:04:38.244307 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.244270 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx" podUID="2a52f9ca-448f-4e24-814a-6a163ca3e526" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused" Apr 16 17:04:38.347518 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.347481 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4xcr\" (UniqueName: \"kubernetes.io/projected/2e0a5eaa-4271-403b-8817-bb12bea1f93c-kube-api-access-p4xcr\") pod \"2e0a5eaa-4271-403b-8817-bb12bea1f93c\" (UID: \"2e0a5eaa-4271-403b-8817-bb12bea1f93c\") " Apr 16 17:04:38.347728 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.347539 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e0a5eaa-4271-403b-8817-bb12bea1f93c-kserve-provision-location\") pod \"2e0a5eaa-4271-403b-8817-bb12bea1f93c\" (UID: \"2e0a5eaa-4271-403b-8817-bb12bea1f93c\") " Apr 16 17:04:38.347728 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.347583 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2e0a5eaa-4271-403b-8817-bb12bea1f93c-dshm\") pod \"2e0a5eaa-4271-403b-8817-bb12bea1f93c\" (UID: \"2e0a5eaa-4271-403b-8817-bb12bea1f93c\") " Apr 16 17:04:38.347728 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.347603 2572 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2e0a5eaa-4271-403b-8817-bb12bea1f93c-home\") pod \"2e0a5eaa-4271-403b-8817-bb12bea1f93c\" (UID: \"2e0a5eaa-4271-403b-8817-bb12bea1f93c\") " Apr 16 17:04:38.347728 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.347657 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2e0a5eaa-4271-403b-8817-bb12bea1f93c-model-cache\") pod \"2e0a5eaa-4271-403b-8817-bb12bea1f93c\" (UID: \"2e0a5eaa-4271-403b-8817-bb12bea1f93c\") " Apr 16 17:04:38.347728 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.347700 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2e0a5eaa-4271-403b-8817-bb12bea1f93c-tls-certs\") pod \"2e0a5eaa-4271-403b-8817-bb12bea1f93c\" (UID: \"2e0a5eaa-4271-403b-8817-bb12bea1f93c\") " Apr 16 17:04:38.347998 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.347968 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e0a5eaa-4271-403b-8817-bb12bea1f93c-home" (OuterVolumeSpecName: "home") pod "2e0a5eaa-4271-403b-8817-bb12bea1f93c" (UID: "2e0a5eaa-4271-403b-8817-bb12bea1f93c"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:04:38.348327 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.348276 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e0a5eaa-4271-403b-8817-bb12bea1f93c-model-cache" (OuterVolumeSpecName: "model-cache") pod "2e0a5eaa-4271-403b-8817-bb12bea1f93c" (UID: "2e0a5eaa-4271-403b-8817-bb12bea1f93c"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:04:38.349952 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.349922 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e0a5eaa-4271-403b-8817-bb12bea1f93c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2e0a5eaa-4271-403b-8817-bb12bea1f93c" (UID: "2e0a5eaa-4271-403b-8817-bb12bea1f93c"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:04:38.350184 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.350158 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e0a5eaa-4271-403b-8817-bb12bea1f93c-dshm" (OuterVolumeSpecName: "dshm") pod "2e0a5eaa-4271-403b-8817-bb12bea1f93c" (UID: "2e0a5eaa-4271-403b-8817-bb12bea1f93c"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:04:38.350184 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.350171 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e0a5eaa-4271-403b-8817-bb12bea1f93c-kube-api-access-p4xcr" (OuterVolumeSpecName: "kube-api-access-p4xcr") pod "2e0a5eaa-4271-403b-8817-bb12bea1f93c" (UID: "2e0a5eaa-4271-403b-8817-bb12bea1f93c"). InnerVolumeSpecName "kube-api-access-p4xcr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:04:38.389256 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.389197 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e0a5eaa-4271-403b-8817-bb12bea1f93c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2e0a5eaa-4271-403b-8817-bb12bea1f93c" (UID: "2e0a5eaa-4271-403b-8817-bb12bea1f93c"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:04:38.448766 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.448721 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p4xcr\" (UniqueName: \"kubernetes.io/projected/2e0a5eaa-4271-403b-8817-bb12bea1f93c-kube-api-access-p4xcr\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:04:38.448766 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.448767 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e0a5eaa-4271-403b-8817-bb12bea1f93c-kserve-provision-location\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:04:38.449022 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.448783 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2e0a5eaa-4271-403b-8817-bb12bea1f93c-dshm\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:04:38.449022 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.448799 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2e0a5eaa-4271-403b-8817-bb12bea1f93c-home\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:04:38.449022 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.448812 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2e0a5eaa-4271-403b-8817-bb12bea1f93c-model-cache\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:04:38.449022 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.448827 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2e0a5eaa-4271-403b-8817-bb12bea1f93c-tls-certs\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:04:38.628516 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.628485 2572 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-968f7c9f5-dcrlb_2e0a5eaa-4271-403b-8817-bb12bea1f93c/main/0.log" Apr 16 17:04:38.628917 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.628890 2572 generic.go:358] "Generic (PLEG): container finished" podID="2e0a5eaa-4271-403b-8817-bb12bea1f93c" containerID="efb53025fdfb2eb34072908ddc48283ab758a0658271c84eee81ecf2ab27ccf9" exitCode=137 Apr 16 17:04:38.629002 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.628977 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" event={"ID":"2e0a5eaa-4271-403b-8817-bb12bea1f93c","Type":"ContainerDied","Data":"efb53025fdfb2eb34072908ddc48283ab758a0658271c84eee81ecf2ab27ccf9"} Apr 16 17:04:38.629043 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.629027 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" event={"ID":"2e0a5eaa-4271-403b-8817-bb12bea1f93c","Type":"ContainerDied","Data":"b788e0f0ad4b47455add75f92df3c4968c48282dbe766d55e3b9be2b64abdcff"} Apr 16 17:04:38.629096 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.629049 2572 scope.go:117] "RemoveContainer" containerID="efb53025fdfb2eb34072908ddc48283ab758a0658271c84eee81ecf2ab27ccf9" Apr 16 17:04:38.629096 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.628992 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb" Apr 16 17:04:38.649764 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.649739 2572 scope.go:117] "RemoveContainer" containerID="3ee32af601795275c35b982d6f23a790be54956dbf50ae7392e95473988961f6" Apr 16 17:04:38.657839 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.657808 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb"] Apr 16 17:04:38.662147 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.662117 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-968f7c9f5-dcrlb"] Apr 16 17:04:38.703372 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.703348 2572 scope.go:117] "RemoveContainer" containerID="efb53025fdfb2eb34072908ddc48283ab758a0658271c84eee81ecf2ab27ccf9" Apr 16 17:04:38.703786 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:04:38.703761 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efb53025fdfb2eb34072908ddc48283ab758a0658271c84eee81ecf2ab27ccf9\": container with ID starting with efb53025fdfb2eb34072908ddc48283ab758a0658271c84eee81ecf2ab27ccf9 not found: ID does not exist" containerID="efb53025fdfb2eb34072908ddc48283ab758a0658271c84eee81ecf2ab27ccf9" Apr 16 17:04:38.703891 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.703796 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efb53025fdfb2eb34072908ddc48283ab758a0658271c84eee81ecf2ab27ccf9"} err="failed to get container status \"efb53025fdfb2eb34072908ddc48283ab758a0658271c84eee81ecf2ab27ccf9\": rpc error: code = NotFound desc = could not find container \"efb53025fdfb2eb34072908ddc48283ab758a0658271c84eee81ecf2ab27ccf9\": container with ID starting with efb53025fdfb2eb34072908ddc48283ab758a0658271c84eee81ecf2ab27ccf9 not found: ID 
does not exist" Apr 16 17:04:38.703891 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.703818 2572 scope.go:117] "RemoveContainer" containerID="3ee32af601795275c35b982d6f23a790be54956dbf50ae7392e95473988961f6" Apr 16 17:04:38.704124 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:04:38.704104 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ee32af601795275c35b982d6f23a790be54956dbf50ae7392e95473988961f6\": container with ID starting with 3ee32af601795275c35b982d6f23a790be54956dbf50ae7392e95473988961f6 not found: ID does not exist" containerID="3ee32af601795275c35b982d6f23a790be54956dbf50ae7392e95473988961f6" Apr 16 17:04:38.704208 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:38.704130 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ee32af601795275c35b982d6f23a790be54956dbf50ae7392e95473988961f6"} err="failed to get container status \"3ee32af601795275c35b982d6f23a790be54956dbf50ae7392e95473988961f6\": rpc error: code = NotFound desc = could not find container \"3ee32af601795275c35b982d6f23a790be54956dbf50ae7392e95473988961f6\": container with ID starting with 3ee32af601795275c35b982d6f23a790be54956dbf50ae7392e95473988961f6 not found: ID does not exist" Apr 16 17:04:39.592753 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:39.592716 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e0a5eaa-4271-403b-8817-bb12bea1f93c" path="/var/lib/kubelet/pods/2e0a5eaa-4271-403b-8817-bb12bea1f93c/volumes" Apr 16 17:04:42.162285 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:42.162232 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf" podUID="1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6" containerName="main" probeResult="failure" output="Get \"https://10.133.0.57:8000/health\": dial tcp 10.133.0.57:8000: connect: connection refused" Apr 16 17:04:48.244871 
ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:48.244823 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx" podUID="2a52f9ca-448f-4e24-814a-6a163ca3e526" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused" Apr 16 17:04:52.163212 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:52.163161 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf" podUID="1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6" containerName="main" probeResult="failure" output="Get \"https://10.133.0.57:8000/health\": dial tcp 10.133.0.57:8000: connect: connection refused" Apr 16 17:04:58.244770 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:04:58.244713 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx" podUID="2a52f9ca-448f-4e24-814a-6a163ca3e526" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused" Apr 16 17:05:02.163013 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:05:02.162965 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf" podUID="1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6" containerName="main" probeResult="failure" output="Get \"https://10.133.0.57:8000/health\": dial tcp 10.133.0.57:8000: connect: connection refused" Apr 16 17:05:08.244838 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:05:08.244779 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx" podUID="2a52f9ca-448f-4e24-814a-6a163ca3e526" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused" Apr 16 
17:05:12.162158 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:05:12.162112 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf" podUID="1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6" containerName="main" probeResult="failure" output="Get \"https://10.133.0.57:8000/health\": dial tcp 10.133.0.57:8000: connect: connection refused" Apr 16 17:05:18.244041 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:05:18.243996 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx" podUID="2a52f9ca-448f-4e24-814a-6a163ca3e526" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused" Apr 16 17:05:22.162558 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:05:22.162510 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf" podUID="1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6" containerName="main" probeResult="failure" output="Get \"https://10.133.0.57:8000/health\": dial tcp 10.133.0.57:8000: connect: connection refused" Apr 16 17:05:28.244298 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:05:28.244253 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx" podUID="2a52f9ca-448f-4e24-814a-6a163ca3e526" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused" Apr 16 17:05:32.162658 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:05:32.162614 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf" podUID="1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6" containerName="main" probeResult="failure" output="Get \"https://10.133.0.57:8000/health\": dial tcp 10.133.0.57:8000: connect: connection refused" Apr 16 
17:05:38.243835 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:05:38.243789 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx" podUID="2a52f9ca-448f-4e24-814a-6a163ca3e526" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused" Apr 16 17:05:42.171922 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:05:42.171889 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf" Apr 16 17:05:42.179994 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:05:42.179958 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf" Apr 16 17:05:43.176961 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:05:43.176922 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf"] Apr 16 17:05:43.903552 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:05:43.903478 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf" podUID="1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6" containerName="main" containerID="cri-o://49685ce5b9dd4ec66f8a734083c700255c41f7d3f929a0aae16b166eb01e051b" gracePeriod=30 Apr 16 17:05:48.244712 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:05:48.244615 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx" podUID="2a52f9ca-448f-4e24-814a-6a163ca3e526" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused" Apr 16 17:05:58.244806 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:05:58.244757 2572 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx" podUID="2a52f9ca-448f-4e24-814a-6a163ca3e526" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused" Apr 16 17:06:08.253889 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:08.253857 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx" Apr 16 17:06:08.261710 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:08.261680 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx" Apr 16 17:06:14.168758 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:14.168702 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-5698dc8fcf-ng9bf_1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6/main/0.log" Apr 16 17:06:14.169133 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:14.169036 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf" Apr 16 17:06:14.197222 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:14.196497 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-kserve-provision-location\") pod \"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6\" (UID: \"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6\") " Apr 16 17:06:14.197222 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:14.196558 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-home\") pod \"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6\" (UID: \"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6\") " Apr 16 17:06:14.197222 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:14.196613 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-model-cache\") pod \"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6\" (UID: \"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6\") " Apr 16 17:06:14.197222 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:14.196641 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75dck\" (UniqueName: \"kubernetes.io/projected/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-kube-api-access-75dck\") pod \"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6\" (UID: \"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6\") " Apr 16 17:06:14.197222 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:14.196695 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-tls-certs\") pod \"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6\" (UID: \"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6\") " Apr 16 17:06:14.197222 ip-10-0-137-126 
kubenswrapper[2572]: I0416 17:06:14.196724 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-dshm\") pod \"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6\" (UID: \"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6\") " Apr 16 17:06:14.197625 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:14.197338 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-model-cache" (OuterVolumeSpecName: "model-cache") pod "1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6" (UID: "1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:06:14.197680 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:14.197631 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-home" (OuterVolumeSpecName: "home") pod "1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6" (UID: "1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:06:14.201704 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:14.201657 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6" (UID: "1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:06:14.203172 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:14.203132 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-dshm" (OuterVolumeSpecName: "dshm") pod "1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6" (UID: "1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6"). 
InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:06:14.207505 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:14.207463 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-kube-api-access-75dck" (OuterVolumeSpecName: "kube-api-access-75dck") pod "1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6" (UID: "1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6"). InnerVolumeSpecName "kube-api-access-75dck". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:06:14.274169 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:14.274126 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6" (UID: "1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:06:14.298306 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:14.298250 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-tls-certs\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 17:06:14.298306 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:14.298305 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-dshm\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 17:06:14.298603 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:14.298324 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-kserve-provision-location\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 17:06:14.298603 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:14.298360 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-home\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 17:06:14.298603 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:14.298382 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-model-cache\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 17:06:14.298603 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:14.298399 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-75dck\" (UniqueName: \"kubernetes.io/projected/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6-kube-api-access-75dck\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 17:06:14.485393 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:14.485311 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx"]
Apr 16 17:06:14.485641 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:14.485608 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx" podUID="2a52f9ca-448f-4e24-814a-6a163ca3e526" containerName="main" containerID="cri-o://039901f35a7e6ee7aad462137c8deb034a67a6b24ebfb74ccd9a4264685e557b" gracePeriod=30
Apr 16 17:06:15.028763 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:15.028737 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-5698dc8fcf-ng9bf_1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6/main/0.log"
Apr 16 17:06:15.029167 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:15.029141 2572 generic.go:358] "Generic (PLEG): container finished" podID="1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6" containerID="49685ce5b9dd4ec66f8a734083c700255c41f7d3f929a0aae16b166eb01e051b" exitCode=137
Apr 16 17:06:15.029284 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:15.029222 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf"
Apr 16 17:06:15.029334 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:15.029223 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf" event={"ID":"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6","Type":"ContainerDied","Data":"49685ce5b9dd4ec66f8a734083c700255c41f7d3f929a0aae16b166eb01e051b"}
Apr 16 17:06:15.029334 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:15.029328 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf" event={"ID":"1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6","Type":"ContainerDied","Data":"360b8965675e566671edb8186305f819bd88d50b6414c788884d16091465f90e"}
Apr 16 17:06:15.029420 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:15.029348 2572 scope.go:117] "RemoveContainer" containerID="49685ce5b9dd4ec66f8a734083c700255c41f7d3f929a0aae16b166eb01e051b"
Apr 16 17:06:15.051838 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:15.051813 2572 scope.go:117] "RemoveContainer" containerID="8005457c02fe2f3b90c5cc676f5e6ca7d1f6cb7e5fbd40b033ce49b5c682765c"
Apr 16 17:06:15.052578 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:15.052554 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf"]
Apr 16 17:06:15.055867 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:15.055849 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5698dc8fcf-ng9bf"]
Apr 16 17:06:15.120907 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:15.120885 2572 scope.go:117] "RemoveContainer" containerID="49685ce5b9dd4ec66f8a734083c700255c41f7d3f929a0aae16b166eb01e051b"
Apr 16 17:06:15.121252 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:06:15.121228 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49685ce5b9dd4ec66f8a734083c700255c41f7d3f929a0aae16b166eb01e051b\": container with ID starting with 49685ce5b9dd4ec66f8a734083c700255c41f7d3f929a0aae16b166eb01e051b not found: ID does not exist" containerID="49685ce5b9dd4ec66f8a734083c700255c41f7d3f929a0aae16b166eb01e051b"
Apr 16 17:06:15.121353 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:15.121262 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49685ce5b9dd4ec66f8a734083c700255c41f7d3f929a0aae16b166eb01e051b"} err="failed to get container status \"49685ce5b9dd4ec66f8a734083c700255c41f7d3f929a0aae16b166eb01e051b\": rpc error: code = NotFound desc = could not find container \"49685ce5b9dd4ec66f8a734083c700255c41f7d3f929a0aae16b166eb01e051b\": container with ID starting with 49685ce5b9dd4ec66f8a734083c700255c41f7d3f929a0aae16b166eb01e051b not found: ID does not exist"
Apr 16 17:06:15.121353 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:15.121283 2572 scope.go:117] "RemoveContainer" containerID="8005457c02fe2f3b90c5cc676f5e6ca7d1f6cb7e5fbd40b033ce49b5c682765c"
Apr 16 17:06:15.121555 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:06:15.121538 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8005457c02fe2f3b90c5cc676f5e6ca7d1f6cb7e5fbd40b033ce49b5c682765c\": container with ID starting with 8005457c02fe2f3b90c5cc676f5e6ca7d1f6cb7e5fbd40b033ce49b5c682765c not found: ID does not exist" containerID="8005457c02fe2f3b90c5cc676f5e6ca7d1f6cb7e5fbd40b033ce49b5c682765c"
Apr 16 17:06:15.121595 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:15.121566 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8005457c02fe2f3b90c5cc676f5e6ca7d1f6cb7e5fbd40b033ce49b5c682765c"} err="failed to get container status \"8005457c02fe2f3b90c5cc676f5e6ca7d1f6cb7e5fbd40b033ce49b5c682765c\": rpc error: code = NotFound desc = could not find container \"8005457c02fe2f3b90c5cc676f5e6ca7d1f6cb7e5fbd40b033ce49b5c682765c\": container with ID starting with 8005457c02fe2f3b90c5cc676f5e6ca7d1f6cb7e5fbd40b033ce49b5c682765c not found: ID does not exist"
Apr 16 17:06:15.590903 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:15.590867 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6" path="/var/lib/kubelet/pods/1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6/volumes"
Apr 16 17:06:20.792139 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:20.792104 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-7c8b759dfd-qjwzr"]
Apr 16 17:06:20.792532 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:20.792332 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/llmisvc-controller-manager-7c8b759dfd-qjwzr" podUID="56bec0b1-0056-4327-9ab4-ceafc5a4df9f" containerName="manager" containerID="cri-o://ad4dc70c7a9d48a93b5b3d0eca06bdb7fc679ce9a4af2ae3701a26d705f0099d" gracePeriod=30
Apr 16 17:06:23.940161 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:23.940138 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-7c8b759dfd-qjwzr"
Apr 16 17:06:23.979162 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:23.979134 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/56bec0b1-0056-4327-9ab4-ceafc5a4df9f-cert\") pod \"56bec0b1-0056-4327-9ab4-ceafc5a4df9f\" (UID: \"56bec0b1-0056-4327-9ab4-ceafc5a4df9f\") "
Apr 16 17:06:23.979333 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:23.979170 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qf9gz\" (UniqueName: \"kubernetes.io/projected/56bec0b1-0056-4327-9ab4-ceafc5a4df9f-kube-api-access-qf9gz\") pod \"56bec0b1-0056-4327-9ab4-ceafc5a4df9f\" (UID: \"56bec0b1-0056-4327-9ab4-ceafc5a4df9f\") "
Apr 16 17:06:23.981175 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:23.981145 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56bec0b1-0056-4327-9ab4-ceafc5a4df9f-cert" (OuterVolumeSpecName: "cert") pod "56bec0b1-0056-4327-9ab4-ceafc5a4df9f" (UID: "56bec0b1-0056-4327-9ab4-ceafc5a4df9f"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 17:06:23.981276 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:23.981178 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56bec0b1-0056-4327-9ab4-ceafc5a4df9f-kube-api-access-qf9gz" (OuterVolumeSpecName: "kube-api-access-qf9gz") pod "56bec0b1-0056-4327-9ab4-ceafc5a4df9f" (UID: "56bec0b1-0056-4327-9ab4-ceafc5a4df9f"). InnerVolumeSpecName "kube-api-access-qf9gz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 17:06:24.063886 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:24.063801 2572 generic.go:358] "Generic (PLEG): container finished" podID="56bec0b1-0056-4327-9ab4-ceafc5a4df9f" containerID="ad4dc70c7a9d48a93b5b3d0eca06bdb7fc679ce9a4af2ae3701a26d705f0099d" exitCode=0
Apr 16 17:06:24.063886 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:24.063864 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-7c8b759dfd-qjwzr"
Apr 16 17:06:24.064105 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:24.063883 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-7c8b759dfd-qjwzr" event={"ID":"56bec0b1-0056-4327-9ab4-ceafc5a4df9f","Type":"ContainerDied","Data":"ad4dc70c7a9d48a93b5b3d0eca06bdb7fc679ce9a4af2ae3701a26d705f0099d"}
Apr 16 17:06:24.064105 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:24.063917 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-7c8b759dfd-qjwzr" event={"ID":"56bec0b1-0056-4327-9ab4-ceafc5a4df9f","Type":"ContainerDied","Data":"445a98decf9a3e393b2d28501d663daac3f1b2372d403dcbc058d2e6e3c63f27"}
Apr 16 17:06:24.064105 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:24.063931 2572 scope.go:117] "RemoveContainer" containerID="ad4dc70c7a9d48a93b5b3d0eca06bdb7fc679ce9a4af2ae3701a26d705f0099d"
Apr 16 17:06:24.073570 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:24.073549 2572 scope.go:117] "RemoveContainer" containerID="ad4dc70c7a9d48a93b5b3d0eca06bdb7fc679ce9a4af2ae3701a26d705f0099d"
Apr 16 17:06:24.073811 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:06:24.073792 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad4dc70c7a9d48a93b5b3d0eca06bdb7fc679ce9a4af2ae3701a26d705f0099d\": container with ID starting with ad4dc70c7a9d48a93b5b3d0eca06bdb7fc679ce9a4af2ae3701a26d705f0099d not found: ID does not exist" containerID="ad4dc70c7a9d48a93b5b3d0eca06bdb7fc679ce9a4af2ae3701a26d705f0099d"
Apr 16 17:06:24.073880 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:24.073818 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad4dc70c7a9d48a93b5b3d0eca06bdb7fc679ce9a4af2ae3701a26d705f0099d"} err="failed to get container status \"ad4dc70c7a9d48a93b5b3d0eca06bdb7fc679ce9a4af2ae3701a26d705f0099d\": rpc error: code = NotFound desc = could not find container \"ad4dc70c7a9d48a93b5b3d0eca06bdb7fc679ce9a4af2ae3701a26d705f0099d\": container with ID starting with ad4dc70c7a9d48a93b5b3d0eca06bdb7fc679ce9a4af2ae3701a26d705f0099d not found: ID does not exist"
Apr 16 17:06:24.080726 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:24.080711 2572 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/56bec0b1-0056-4327-9ab4-ceafc5a4df9f-cert\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 17:06:24.080778 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:24.080730 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qf9gz\" (UniqueName: \"kubernetes.io/projected/56bec0b1-0056-4327-9ab4-ceafc5a4df9f-kube-api-access-qf9gz\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 17:06:24.085767 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:24.085742 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-7c8b759dfd-qjwzr"]
Apr 16 17:06:24.090982 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:24.090960 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/llmisvc-controller-manager-7c8b759dfd-qjwzr"]
Apr 16 17:06:25.590764 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:25.590726 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56bec0b1-0056-4327-9ab4-ceafc5a4df9f" path="/var/lib/kubelet/pods/56bec0b1-0056-4327-9ab4-ceafc5a4df9f/volumes"
Apr 16 17:06:44.724006 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:44.723982 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-5959bdbb55-8fxgx_2a52f9ca-448f-4e24-814a-6a163ca3e526/main/0.log"
Apr 16 17:06:44.724338 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:44.724288 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx"
Apr 16 17:06:44.855838 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:44.855760 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gm6t\" (UniqueName: \"kubernetes.io/projected/2a52f9ca-448f-4e24-814a-6a163ca3e526-kube-api-access-6gm6t\") pod \"2a52f9ca-448f-4e24-814a-6a163ca3e526\" (UID: \"2a52f9ca-448f-4e24-814a-6a163ca3e526\") "
Apr 16 17:06:44.855838 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:44.855791 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2a52f9ca-448f-4e24-814a-6a163ca3e526-home\") pod \"2a52f9ca-448f-4e24-814a-6a163ca3e526\" (UID: \"2a52f9ca-448f-4e24-814a-6a163ca3e526\") "
Apr 16 17:06:44.855838 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:44.855810 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2a52f9ca-448f-4e24-814a-6a163ca3e526-model-cache\") pod \"2a52f9ca-448f-4e24-814a-6a163ca3e526\" (UID: \"2a52f9ca-448f-4e24-814a-6a163ca3e526\") "
Apr 16 17:06:44.855838 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:44.855828 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2a52f9ca-448f-4e24-814a-6a163ca3e526-kserve-provision-location\") pod \"2a52f9ca-448f-4e24-814a-6a163ca3e526\" (UID: \"2a52f9ca-448f-4e24-814a-6a163ca3e526\") "
Apr 16 17:06:44.855838 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:44.855842 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2a52f9ca-448f-4e24-814a-6a163ca3e526-tls-certs\") pod \"2a52f9ca-448f-4e24-814a-6a163ca3e526\" (UID: \"2a52f9ca-448f-4e24-814a-6a163ca3e526\") "
Apr 16 17:06:44.856268 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:44.855890 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2a52f9ca-448f-4e24-814a-6a163ca3e526-dshm\") pod \"2a52f9ca-448f-4e24-814a-6a163ca3e526\" (UID: \"2a52f9ca-448f-4e24-814a-6a163ca3e526\") "
Apr 16 17:06:44.856268 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:44.856050 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a52f9ca-448f-4e24-814a-6a163ca3e526-model-cache" (OuterVolumeSpecName: "model-cache") pod "2a52f9ca-448f-4e24-814a-6a163ca3e526" (UID: "2a52f9ca-448f-4e24-814a-6a163ca3e526"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:06:44.856268 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:44.856161 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2a52f9ca-448f-4e24-814a-6a163ca3e526-model-cache\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 17:06:44.856268 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:44.856193 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a52f9ca-448f-4e24-814a-6a163ca3e526-home" (OuterVolumeSpecName: "home") pod "2a52f9ca-448f-4e24-814a-6a163ca3e526" (UID: "2a52f9ca-448f-4e24-814a-6a163ca3e526"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:06:44.857987 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:44.857954 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a52f9ca-448f-4e24-814a-6a163ca3e526-kube-api-access-6gm6t" (OuterVolumeSpecName: "kube-api-access-6gm6t") pod "2a52f9ca-448f-4e24-814a-6a163ca3e526" (UID: "2a52f9ca-448f-4e24-814a-6a163ca3e526"). InnerVolumeSpecName "kube-api-access-6gm6t". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 17:06:44.858112 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:44.858001 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a52f9ca-448f-4e24-814a-6a163ca3e526-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2a52f9ca-448f-4e24-814a-6a163ca3e526" (UID: "2a52f9ca-448f-4e24-814a-6a163ca3e526"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 17:06:44.858112 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:44.858012 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a52f9ca-448f-4e24-814a-6a163ca3e526-dshm" (OuterVolumeSpecName: "dshm") pod "2a52f9ca-448f-4e24-814a-6a163ca3e526" (UID: "2a52f9ca-448f-4e24-814a-6a163ca3e526"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:06:44.923851 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:44.923810 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a52f9ca-448f-4e24-814a-6a163ca3e526-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2a52f9ca-448f-4e24-814a-6a163ca3e526" (UID: "2a52f9ca-448f-4e24-814a-6a163ca3e526"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:06:44.956530 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:44.956506 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2a52f9ca-448f-4e24-814a-6a163ca3e526-kserve-provision-location\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 17:06:44.956530 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:44.956529 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2a52f9ca-448f-4e24-814a-6a163ca3e526-tls-certs\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 17:06:44.956693 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:44.956539 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2a52f9ca-448f-4e24-814a-6a163ca3e526-dshm\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 17:06:44.956693 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:44.956549 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6gm6t\" (UniqueName: \"kubernetes.io/projected/2a52f9ca-448f-4e24-814a-6a163ca3e526-kube-api-access-6gm6t\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 17:06:44.956693 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:44.956558 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2a52f9ca-448f-4e24-814a-6a163ca3e526-home\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 17:06:45.143972 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:45.143947 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-5959bdbb55-8fxgx_2a52f9ca-448f-4e24-814a-6a163ca3e526/main/0.log"
Apr 16 17:06:45.144300 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:45.144276 2572 generic.go:358] "Generic (PLEG): container finished" podID="2a52f9ca-448f-4e24-814a-6a163ca3e526" containerID="039901f35a7e6ee7aad462137c8deb034a67a6b24ebfb74ccd9a4264685e557b" exitCode=137
Apr 16 17:06:45.144373 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:45.144360 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx"
Apr 16 17:06:45.144430 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:45.144363 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx" event={"ID":"2a52f9ca-448f-4e24-814a-6a163ca3e526","Type":"ContainerDied","Data":"039901f35a7e6ee7aad462137c8deb034a67a6b24ebfb74ccd9a4264685e557b"}
Apr 16 17:06:45.144430 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:45.144409 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx" event={"ID":"2a52f9ca-448f-4e24-814a-6a163ca3e526","Type":"ContainerDied","Data":"607c10a5ac348450ba7dde7e16bf78a133878370a5b998273326aab1e99b68e5"}
Apr 16 17:06:45.144534 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:45.144433 2572 scope.go:117] "RemoveContainer" containerID="039901f35a7e6ee7aad462137c8deb034a67a6b24ebfb74ccd9a4264685e557b"
Apr 16 17:06:45.164541 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:45.164524 2572 scope.go:117] "RemoveContainer" containerID="7bfc0efb2d135606427facc6fc44fb4b7a29800d6573881a1b418264e031793b"
Apr 16 17:06:45.169610 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:45.169588 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx"]
Apr 16 17:06:45.174797 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:45.174780 2572 scope.go:117] "RemoveContainer" containerID="039901f35a7e6ee7aad462137c8deb034a67a6b24ebfb74ccd9a4264685e557b"
Apr 16 17:06:45.175054 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:06:45.175033 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"039901f35a7e6ee7aad462137c8deb034a67a6b24ebfb74ccd9a4264685e557b\": container with ID starting with 039901f35a7e6ee7aad462137c8deb034a67a6b24ebfb74ccd9a4264685e557b not found: ID does not exist" containerID="039901f35a7e6ee7aad462137c8deb034a67a6b24ebfb74ccd9a4264685e557b"
Apr 16 17:06:45.175128 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:45.175056 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-5959bdbb55-8fxgx"]
Apr 16 17:06:45.175128 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:45.175082 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"039901f35a7e6ee7aad462137c8deb034a67a6b24ebfb74ccd9a4264685e557b"} err="failed to get container status \"039901f35a7e6ee7aad462137c8deb034a67a6b24ebfb74ccd9a4264685e557b\": rpc error: code = NotFound desc = could not find container \"039901f35a7e6ee7aad462137c8deb034a67a6b24ebfb74ccd9a4264685e557b\": container with ID starting with 039901f35a7e6ee7aad462137c8deb034a67a6b24ebfb74ccd9a4264685e557b not found: ID does not exist"
Apr 16 17:06:45.175128 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:45.175102 2572 scope.go:117] "RemoveContainer" containerID="7bfc0efb2d135606427facc6fc44fb4b7a29800d6573881a1b418264e031793b"
Apr 16 17:06:45.175362 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:06:45.175345 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bfc0efb2d135606427facc6fc44fb4b7a29800d6573881a1b418264e031793b\": container with ID starting with 7bfc0efb2d135606427facc6fc44fb4b7a29800d6573881a1b418264e031793b not found: ID does not exist" containerID="7bfc0efb2d135606427facc6fc44fb4b7a29800d6573881a1b418264e031793b"
Apr 16 17:06:45.175408 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:45.175366 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bfc0efb2d135606427facc6fc44fb4b7a29800d6573881a1b418264e031793b"} err="failed to get container status \"7bfc0efb2d135606427facc6fc44fb4b7a29800d6573881a1b418264e031793b\": rpc error: code = NotFound desc = could not find container \"7bfc0efb2d135606427facc6fc44fb4b7a29800d6573881a1b418264e031793b\": container with ID starting with 7bfc0efb2d135606427facc6fc44fb4b7a29800d6573881a1b418264e031793b not found: ID does not exist"
Apr 16 17:06:45.590849 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:45.590771 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a52f9ca-448f-4e24-814a-6a163ca3e526" path="/var/lib/kubelet/pods/2a52f9ca-448f-4e24-814a-6a163ca3e526/volumes"
Apr 16 17:06:47.415774 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.415738 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l"]
Apr 16 17:06:47.416169 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.416086 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e0a5eaa-4271-403b-8817-bb12bea1f93c" containerName="storage-initializer"
Apr 16 17:06:47.416169 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.416097 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e0a5eaa-4271-403b-8817-bb12bea1f93c" containerName="storage-initializer"
Apr 16 17:06:47.416169 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.416107 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2a52f9ca-448f-4e24-814a-6a163ca3e526" containerName="main"
Apr 16 17:06:47.416169 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.416113 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a52f9ca-448f-4e24-814a-6a163ca3e526" containerName="main"
Apr 16 17:06:47.416169 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.416124 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e0a5eaa-4271-403b-8817-bb12bea1f93c" containerName="main"
Apr 16 17:06:47.416169 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.416129 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e0a5eaa-4271-403b-8817-bb12bea1f93c" containerName="main"
Apr 16 17:06:47.416169 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.416134 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6" containerName="storage-initializer"
Apr 16 17:06:47.416169 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.416139 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6" containerName="storage-initializer"
Apr 16 17:06:47.416169 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.416146 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2a52f9ca-448f-4e24-814a-6a163ca3e526" containerName="storage-initializer"
Apr 16 17:06:47.416169 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.416152 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a52f9ca-448f-4e24-814a-6a163ca3e526" containerName="storage-initializer"
Apr 16 17:06:47.416169 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.416159 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56bec0b1-0056-4327-9ab4-ceafc5a4df9f" containerName="manager"
Apr 16 17:06:47.416169 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.416163 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="56bec0b1-0056-4327-9ab4-ceafc5a4df9f" containerName="manager"
Apr 16 17:06:47.416169 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.416175 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6" containerName="main"
Apr 16 17:06:47.416576 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.416181 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6" containerName="main"
Apr 16 17:06:47.416576 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.416230 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="1f6e7d4b-cd5b-43ba-b1b5-95c01c4763c6" containerName="main"
Apr 16 17:06:47.416576 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.416239 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="2e0a5eaa-4271-403b-8817-bb12bea1f93c" containerName="main"
Apr 16 17:06:47.416576 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.416244 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="56bec0b1-0056-4327-9ab4-ceafc5a4df9f" containerName="manager"
Apr 16 17:06:47.416576 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.416250 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="2a52f9ca-448f-4e24-814a-6a163ca3e526" containerName="main"
Apr 16 17:06:47.421847 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.421823 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l"
Apr 16 17:06:47.424857 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.424836 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 17:06:47.424983 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.424876 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-g4psd\""
Apr 16 17:06:47.424983 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.424892 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\""
Apr 16 17:06:47.425816 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.425796 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-dockercfg-6t4kz\""
Apr 16 17:06:47.425900 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.425808 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 16 17:06:47.432315 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.432287 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l"]
Apr 16 17:06:47.435106 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.435057 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27"]
Apr 16 17:06:47.438671 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.438653 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27"
Apr 16 17:06:47.453855 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.453832 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27"]
Apr 16 17:06:47.576881 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.576852 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27\" (UID: \"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27"
Apr 16 17:06:47.577050 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.576889 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnftv\" (UniqueName: \"kubernetes.io/projected/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-kube-api-access-mnftv\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27\" (UID: \"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27"
Apr 16 17:06:47.577050 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.576910 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7c9q\" (UniqueName: \"kubernetes.io/projected/4b02bc96-301a-4de1-87fb-96d881dc23d4-kube-api-access-l7c9q\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l\" (UID: \"4b02bc96-301a-4de1-87fb-96d881dc23d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l"
Apr 16 17:06:47.577050 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.576946 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4b02bc96-301a-4de1-87fb-96d881dc23d4-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l\" (UID: \"4b02bc96-301a-4de1-87fb-96d881dc23d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l"
Apr 16 17:06:47.577050 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.577000 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4b02bc96-301a-4de1-87fb-96d881dc23d4-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l\" (UID: \"4b02bc96-301a-4de1-87fb-96d881dc23d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l"
Apr 16 17:06:47.577050 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.577020 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4b02bc96-301a-4de1-87fb-96d881dc23d4-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l\" (UID: \"4b02bc96-301a-4de1-87fb-96d881dc23d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l"
Apr 16 17:06:47.577050 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.577038 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27\" (UID: \"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27"
Apr 16 17:06:47.577276 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.577082 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27\" (UID: \"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27"
Apr 16 17:06:47.577276 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.577109 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4b02bc96-301a-4de1-87fb-96d881dc23d4-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l\" (UID: \"4b02bc96-301a-4de1-87fb-96d881dc23d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l"
Apr 16 17:06:47.577276 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.577131 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4b02bc96-301a-4de1-87fb-96d881dc23d4-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l\" (UID: \"4b02bc96-301a-4de1-87fb-96d881dc23d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l"
Apr 16 17:06:47.577276 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.577148 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27\" (UID: \"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27"
Apr 16 17:06:47.577276 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.577188 2572 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27\" (UID: \"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" Apr 16 17:06:47.678477 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.678392 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4b02bc96-301a-4de1-87fb-96d881dc23d4-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l\" (UID: \"4b02bc96-301a-4de1-87fb-96d881dc23d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" Apr 16 17:06:47.678477 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.678436 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4b02bc96-301a-4de1-87fb-96d881dc23d4-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l\" (UID: \"4b02bc96-301a-4de1-87fb-96d881dc23d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" Apr 16 17:06:47.678477 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.678454 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4b02bc96-301a-4de1-87fb-96d881dc23d4-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l\" (UID: \"4b02bc96-301a-4de1-87fb-96d881dc23d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" Apr 16 17:06:47.678477 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.678475 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27\" (UID: \"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" Apr 16 17:06:47.678819 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.678600 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27\" (UID: \"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" Apr 16 17:06:47.678819 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.678649 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4b02bc96-301a-4de1-87fb-96d881dc23d4-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l\" (UID: \"4b02bc96-301a-4de1-87fb-96d881dc23d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" Apr 16 17:06:47.678819 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.678680 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4b02bc96-301a-4de1-87fb-96d881dc23d4-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l\" (UID: \"4b02bc96-301a-4de1-87fb-96d881dc23d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" Apr 16 17:06:47.678819 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.678698 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27\" (UID: \"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" Apr 16 17:06:47.678819 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.678721 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27\" (UID: \"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" Apr 16 17:06:47.678819 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.678777 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27\" (UID: \"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" Apr 16 17:06:47.678819 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.678803 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mnftv\" (UniqueName: \"kubernetes.io/projected/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-kube-api-access-mnftv\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27\" (UID: \"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" Apr 16 17:06:47.679245 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.678834 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7c9q\" (UniqueName: 
\"kubernetes.io/projected/4b02bc96-301a-4de1-87fb-96d881dc23d4-kube-api-access-l7c9q\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l\" (UID: \"4b02bc96-301a-4de1-87fb-96d881dc23d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" Apr 16 17:06:47.679245 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.678854 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4b02bc96-301a-4de1-87fb-96d881dc23d4-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l\" (UID: \"4b02bc96-301a-4de1-87fb-96d881dc23d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" Apr 16 17:06:47.679245 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.679021 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4b02bc96-301a-4de1-87fb-96d881dc23d4-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l\" (UID: \"4b02bc96-301a-4de1-87fb-96d881dc23d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" Apr 16 17:06:47.679245 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.679049 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27\" (UID: \"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" Apr 16 17:06:47.679466 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.679320 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4b02bc96-301a-4de1-87fb-96d881dc23d4-home\") pod 
\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l\" (UID: \"4b02bc96-301a-4de1-87fb-96d881dc23d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" Apr 16 17:06:47.679466 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.679352 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27\" (UID: \"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" Apr 16 17:06:47.679626 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.679552 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27\" (UID: \"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" Apr 16 17:06:47.680847 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.680824 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4b02bc96-301a-4de1-87fb-96d881dc23d4-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l\" (UID: \"4b02bc96-301a-4de1-87fb-96d881dc23d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" Apr 16 17:06:47.681091 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.681057 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27\" (UID: 
\"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" Apr 16 17:06:47.681183 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.681165 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4b02bc96-301a-4de1-87fb-96d881dc23d4-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l\" (UID: \"4b02bc96-301a-4de1-87fb-96d881dc23d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" Apr 16 17:06:47.681368 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.681349 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27\" (UID: \"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" Apr 16 17:06:47.687728 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.687707 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7c9q\" (UniqueName: \"kubernetes.io/projected/4b02bc96-301a-4de1-87fb-96d881dc23d4-kube-api-access-l7c9q\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l\" (UID: \"4b02bc96-301a-4de1-87fb-96d881dc23d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" Apr 16 17:06:47.688184 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.688165 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnftv\" (UniqueName: \"kubernetes.io/projected/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-kube-api-access-mnftv\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27\" (UID: \"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" Apr 16 17:06:47.732153 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.732132 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" Apr 16 17:06:47.752602 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:47.752553 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" Apr 16 17:06:48.074964 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:48.074938 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l"] Apr 16 17:06:48.075685 ip-10-0-137-126 kubenswrapper[2572]: W0416 17:06:48.075659 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b02bc96_301a_4de1_87fb_96d881dc23d4.slice/crio-f3c2b97b8490b13fef5fb7f15dcf6d4adaffec062e8d852abb8643a911d0fa92 WatchSource:0}: Error finding container f3c2b97b8490b13fef5fb7f15dcf6d4adaffec062e8d852abb8643a911d0fa92: Status 404 returned error can't find the container with id f3c2b97b8490b13fef5fb7f15dcf6d4adaffec062e8d852abb8643a911d0fa92 Apr 16 17:06:48.098304 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:48.098272 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27"] Apr 16 17:06:48.099712 ip-10-0-137-126 kubenswrapper[2572]: W0416 17:06:48.099689 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf161f8a8_c5d6_497d_bb8b_c1b73ceb89e2.slice/crio-de321fc8b2ac437f2e79c8d95b652bd15067c782f55af43d398830b23bdd92fb WatchSource:0}: Error finding container 
de321fc8b2ac437f2e79c8d95b652bd15067c782f55af43d398830b23bdd92fb: Status 404 returned error can't find the container with id de321fc8b2ac437f2e79c8d95b652bd15067c782f55af43d398830b23bdd92fb Apr 16 17:06:48.157915 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:48.157885 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" event={"ID":"4b02bc96-301a-4de1-87fb-96d881dc23d4","Type":"ContainerStarted","Data":"f3c2b97b8490b13fef5fb7f15dcf6d4adaffec062e8d852abb8643a911d0fa92"} Apr 16 17:06:48.159242 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:48.159222 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" event={"ID":"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2","Type":"ContainerStarted","Data":"951f9bdd21a2ee2c347816842557f4d804384fc7d84157186c87da10d9879d8b"} Apr 16 17:06:48.159332 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:48.159248 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" event={"ID":"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2","Type":"ContainerStarted","Data":"de321fc8b2ac437f2e79c8d95b652bd15067c782f55af43d398830b23bdd92fb"} Apr 16 17:06:49.165426 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:49.165388 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" event={"ID":"4b02bc96-301a-4de1-87fb-96d881dc23d4","Type":"ContainerStarted","Data":"88bcf30775972ba3e9220029b2415ce1a966faefa909e3e82e2b40df4e9ee054"} Apr 16 17:06:49.165969 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:49.165462 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" Apr 16 17:06:50.172426 ip-10-0-137-126 
kubenswrapper[2572]: I0416 17:06:50.172390 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" event={"ID":"4b02bc96-301a-4de1-87fb-96d881dc23d4","Type":"ContainerStarted","Data":"3244058ea660fc7ff8306f7c3e02902d5874d5c88f009008143e363c8090dc24"} Apr 16 17:06:53.192764 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:53.192726 2572 generic.go:358] "Generic (PLEG): container finished" podID="f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2" containerID="951f9bdd21a2ee2c347816842557f4d804384fc7d84157186c87da10d9879d8b" exitCode=0 Apr 16 17:06:53.193237 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:53.192795 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" event={"ID":"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2","Type":"ContainerDied","Data":"951f9bdd21a2ee2c347816842557f4d804384fc7d84157186c87da10d9879d8b"} Apr 16 17:06:54.198241 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:54.198203 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" event={"ID":"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2","Type":"ContainerStarted","Data":"a8cc358a12b5e42c92eeb54674ca2addd3f38b05924613b49cf0583623ab37a6"} Apr 16 17:06:54.200124 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:54.200098 2572 generic.go:358] "Generic (PLEG): container finished" podID="4b02bc96-301a-4de1-87fb-96d881dc23d4" containerID="3244058ea660fc7ff8306f7c3e02902d5874d5c88f009008143e363c8090dc24" exitCode=0 Apr 16 17:06:54.200270 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:54.200168 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" 
event={"ID":"4b02bc96-301a-4de1-87fb-96d881dc23d4","Type":"ContainerDied","Data":"3244058ea660fc7ff8306f7c3e02902d5874d5c88f009008143e363c8090dc24"} Apr 16 17:06:54.224418 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:54.224362 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" podStartSLOduration=7.224345182 podStartE2EDuration="7.224345182s" podCreationTimestamp="2026-04-16 17:06:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:06:54.219366102 +0000 UTC m=+1131.388077467" watchObservedRunningTime="2026-04-16 17:06:54.224345182 +0000 UTC m=+1131.393056545" Apr 16 17:06:55.206270 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:55.206229 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" event={"ID":"4b02bc96-301a-4de1-87fb-96d881dc23d4","Type":"ContainerStarted","Data":"00cccf8ece717b1d6e597635e2c5d8f20f993347aa176a5434f362c0ce7b1cdc"} Apr 16 17:06:55.230085 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:55.230002 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" podStartSLOduration=7.316192399 podStartE2EDuration="8.229981551s" podCreationTimestamp="2026-04-16 17:06:47 +0000 UTC" firstStartedPulling="2026-04-16 17:06:48.077472956 +0000 UTC m=+1125.246184297" lastFinishedPulling="2026-04-16 17:06:48.991262095 +0000 UTC m=+1126.159973449" observedRunningTime="2026-04-16 17:06:55.226835735 +0000 UTC m=+1132.395547101" watchObservedRunningTime="2026-04-16 17:06:55.229981551 +0000 UTC m=+1132.398692915" Apr 16 17:06:57.732785 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:57.732741 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" Apr 16 17:06:57.732785 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:57.732789 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" Apr 16 17:06:57.734242 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:57.734208 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" podUID="4b02bc96-301a-4de1-87fb-96d881dc23d4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.59:8001/health\": dial tcp 10.133.0.59:8001: connect: connection refused" Apr 16 17:06:57.752979 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:57.752952 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" Apr 16 17:06:57.753085 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:57.752999 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" Apr 16 17:06:57.754234 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:06:57.754208 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" podUID="f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.60:8000/health\": dial tcp 10.133.0.60:8000: connect: connection refused" Apr 16 17:07:07.733461 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:07.733403 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" podUID="4b02bc96-301a-4de1-87fb-96d881dc23d4" containerName="main" 
probeResult="failure" output="Get \"https://10.133.0.59:8001/health\": dial tcp 10.133.0.59:8001: connect: connection refused" Apr 16 17:07:07.753186 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:07.753145 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" podUID="f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.60:8000/health\": dial tcp 10.133.0.60:8000: connect: connection refused" Apr 16 17:07:07.760963 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:07.760934 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" Apr 16 17:07:07.871870 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:07.871837 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22"] Apr 16 17:07:07.907148 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:07.907046 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22"] Apr 16 17:07:07.907331 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:07.907213 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" Apr 16 17:07:07.910412 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:07.910385 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8de1d74aab16d9cabd8b5aafeb5248e8-kserve-self-signed-certs\"" Apr 16 17:07:07.962710 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:07.962676 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8bd093de-7558-445a-af48-f9c7a9d6b76c-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22\" (UID: \"8bd093de-7558-445a-af48-f9c7a9d6b76c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" Apr 16 17:07:07.962890 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:07.962727 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bd093de-7558-445a-af48-f9c7a9d6b76c-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22\" (UID: \"8bd093de-7558-445a-af48-f9c7a9d6b76c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" Apr 16 17:07:07.962890 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:07.962757 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8bd093de-7558-445a-af48-f9c7a9d6b76c-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22\" (UID: \"8bd093de-7558-445a-af48-f9c7a9d6b76c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" Apr 16 17:07:07.962890 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:07.962828 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8bd093de-7558-445a-af48-f9c7a9d6b76c-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22\" (UID: \"8bd093de-7558-445a-af48-f9c7a9d6b76c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" Apr 16 17:07:07.963094 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:07.962898 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm68x\" (UniqueName: \"kubernetes.io/projected/8bd093de-7558-445a-af48-f9c7a9d6b76c-kube-api-access-qm68x\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22\" (UID: \"8bd093de-7558-445a-af48-f9c7a9d6b76c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" Apr 16 17:07:07.963094 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:07.962945 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8bd093de-7558-445a-af48-f9c7a9d6b76c-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22\" (UID: \"8bd093de-7558-445a-af48-f9c7a9d6b76c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" Apr 16 17:07:08.064037 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:08.063915 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8bd093de-7558-445a-af48-f9c7a9d6b76c-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22\" (UID: \"8bd093de-7558-445a-af48-f9c7a9d6b76c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" Apr 16 17:07:08.064235 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:08.064035 2572 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bd093de-7558-445a-af48-f9c7a9d6b76c-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22\" (UID: \"8bd093de-7558-445a-af48-f9c7a9d6b76c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" Apr 16 17:07:08.064235 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:08.064135 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8bd093de-7558-445a-af48-f9c7a9d6b76c-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22\" (UID: \"8bd093de-7558-445a-af48-f9c7a9d6b76c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" Apr 16 17:07:08.064235 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:08.064172 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8bd093de-7558-445a-af48-f9c7a9d6b76c-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22\" (UID: \"8bd093de-7558-445a-af48-f9c7a9d6b76c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" Apr 16 17:07:08.064481 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:08.064453 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qm68x\" (UniqueName: \"kubernetes.io/projected/8bd093de-7558-445a-af48-f9c7a9d6b76c-kube-api-access-qm68x\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22\" (UID: \"8bd093de-7558-445a-af48-f9c7a9d6b76c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" Apr 16 17:07:08.064574 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:08.064524 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" 
(UniqueName: \"kubernetes.io/empty-dir/8bd093de-7558-445a-af48-f9c7a9d6b76c-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22\" (UID: \"8bd093de-7558-445a-af48-f9c7a9d6b76c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" Apr 16 17:07:08.064574 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:08.064528 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bd093de-7558-445a-af48-f9c7a9d6b76c-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22\" (UID: \"8bd093de-7558-445a-af48-f9c7a9d6b76c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" Apr 16 17:07:08.065086 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:08.065044 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8bd093de-7558-445a-af48-f9c7a9d6b76c-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22\" (UID: \"8bd093de-7558-445a-af48-f9c7a9d6b76c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" Apr 16 17:07:08.065315 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:08.065286 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8bd093de-7558-445a-af48-f9c7a9d6b76c-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22\" (UID: \"8bd093de-7558-445a-af48-f9c7a9d6b76c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" Apr 16 17:07:08.066522 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:08.066500 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8bd093de-7558-445a-af48-f9c7a9d6b76c-dshm\") pod 
\"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22\" (UID: \"8bd093de-7558-445a-af48-f9c7a9d6b76c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" Apr 16 17:07:08.066930 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:08.066910 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8bd093de-7558-445a-af48-f9c7a9d6b76c-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22\" (UID: \"8bd093de-7558-445a-af48-f9c7a9d6b76c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" Apr 16 17:07:08.073606 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:08.073580 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm68x\" (UniqueName: \"kubernetes.io/projected/8bd093de-7558-445a-af48-f9c7a9d6b76c-kube-api-access-qm68x\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22\" (UID: \"8bd093de-7558-445a-af48-f9c7a9d6b76c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" Apr 16 17:07:08.217627 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:08.217585 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" Apr 16 17:07:08.365719 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:08.365621 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22"] Apr 16 17:07:08.368695 ip-10-0-137-126 kubenswrapper[2572]: W0416 17:07:08.368663 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bd093de_7558_445a_af48_f9c7a9d6b76c.slice/crio-881e15999b4f4fe32eb186cdee45f548bb0fece2f3ac409a2db1bd22abf0df22 WatchSource:0}: Error finding container 881e15999b4f4fe32eb186cdee45f548bb0fece2f3ac409a2db1bd22abf0df22: Status 404 returned error can't find the container with id 881e15999b4f4fe32eb186cdee45f548bb0fece2f3ac409a2db1bd22abf0df22 Apr 16 17:07:09.272237 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:09.272200 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" event={"ID":"8bd093de-7558-445a-af48-f9c7a9d6b76c","Type":"ContainerStarted","Data":"c12c05bb868d552399183847c74f02dc747d785c6e5ac103136b72f7fb1b41da"} Apr 16 17:07:09.272645 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:09.272245 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" event={"ID":"8bd093de-7558-445a-af48-f9c7a9d6b76c","Type":"ContainerStarted","Data":"881e15999b4f4fe32eb186cdee45f548bb0fece2f3ac409a2db1bd22abf0df22"} Apr 16 17:07:13.291696 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:13.291649 2572 generic.go:358] "Generic (PLEG): container finished" podID="8bd093de-7558-445a-af48-f9c7a9d6b76c" containerID="c12c05bb868d552399183847c74f02dc747d785c6e5ac103136b72f7fb1b41da" exitCode=0 Apr 16 17:07:13.292207 ip-10-0-137-126 kubenswrapper[2572]: I0416 
17:07:13.291721 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" event={"ID":"8bd093de-7558-445a-af48-f9c7a9d6b76c","Type":"ContainerDied","Data":"c12c05bb868d552399183847c74f02dc747d785c6e5ac103136b72f7fb1b41da"} Apr 16 17:07:14.298222 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:14.298176 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" event={"ID":"8bd093de-7558-445a-af48-f9c7a9d6b76c","Type":"ContainerStarted","Data":"bed2740714cecd37defd298497e295534205ebbac65d091ea3e04154c3ed41c3"} Apr 16 17:07:14.320945 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:14.320876 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" podStartSLOduration=7.320857102 podStartE2EDuration="7.320857102s" podCreationTimestamp="2026-04-16 17:07:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:07:14.319386094 +0000 UTC m=+1151.488097462" watchObservedRunningTime="2026-04-16 17:07:14.320857102 +0000 UTC m=+1151.489568465" Apr 16 17:07:17.733563 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:17.733447 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" podUID="4b02bc96-301a-4de1-87fb-96d881dc23d4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.59:8001/health\": dial tcp 10.133.0.59:8001: connect: connection refused" Apr 16 17:07:17.753598 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:17.753555 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" 
podUID="f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.60:8000/health\": dial tcp 10.133.0.60:8000: connect: connection refused" Apr 16 17:07:18.218139 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:18.218095 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" Apr 16 17:07:18.218373 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:18.218153 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" Apr 16 17:07:18.219891 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:18.219855 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" podUID="8bd093de-7558-445a-af48-f9c7a9d6b76c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.61:8000/health\": dial tcp 10.133.0.61:8000: connect: connection refused" Apr 16 17:07:27.733124 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:27.733051 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" podUID="4b02bc96-301a-4de1-87fb-96d881dc23d4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.59:8001/health\": dial tcp 10.133.0.59:8001: connect: connection refused" Apr 16 17:07:27.753753 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:27.753706 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" podUID="f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.60:8000/health\": dial tcp 10.133.0.60:8000: connect: connection refused" Apr 16 17:07:28.218667 ip-10-0-137-126 
kubenswrapper[2572]: I0416 17:07:28.218619 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" podUID="8bd093de-7558-445a-af48-f9c7a9d6b76c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.61:8000/health\": dial tcp 10.133.0.61:8000: connect: connection refused" Apr 16 17:07:37.732962 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:37.732890 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" podUID="4b02bc96-301a-4de1-87fb-96d881dc23d4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.59:8001/health\": dial tcp 10.133.0.59:8001: connect: connection refused" Apr 16 17:07:37.754376 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:37.754318 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" podUID="f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.60:8000/health\": dial tcp 10.133.0.60:8000: connect: connection refused" Apr 16 17:07:38.218393 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:38.218317 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" podUID="8bd093de-7558-445a-af48-f9c7a9d6b76c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.61:8000/health\": dial tcp 10.133.0.61:8000: connect: connection refused" Apr 16 17:07:47.733369 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:47.733313 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" podUID="4b02bc96-301a-4de1-87fb-96d881dc23d4" containerName="main" probeResult="failure" output="Get 
\"https://10.133.0.59:8001/health\": dial tcp 10.133.0.59:8001: connect: connection refused" Apr 16 17:07:47.753715 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:47.753676 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" podUID="f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.60:8000/health\": dial tcp 10.133.0.60:8000: connect: connection refused" Apr 16 17:07:48.218620 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:48.218559 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" podUID="8bd093de-7558-445a-af48-f9c7a9d6b76c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.61:8000/health\": dial tcp 10.133.0.61:8000: connect: connection refused" Apr 16 17:07:57.732765 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:57.732714 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" podUID="4b02bc96-301a-4de1-87fb-96d881dc23d4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.59:8001/health\": dial tcp 10.133.0.59:8001: connect: connection refused" Apr 16 17:07:57.752820 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:57.752787 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" podUID="f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.60:8000/health\": dial tcp 10.133.0.60:8000: connect: connection refused" Apr 16 17:07:58.218122 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:07:58.218054 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" 
podUID="8bd093de-7558-445a-af48-f9c7a9d6b76c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.61:8000/health\": dial tcp 10.133.0.61:8000: connect: connection refused" Apr 16 17:08:03.572141 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:08:03.572112 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brhp4_c0c5c0a0-29b2-4743-af7a-0c1150829a60/ovn-acl-logging/0.log" Apr 16 17:08:03.574109 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:08:03.574088 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brhp4_c0c5c0a0-29b2-4743-af7a-0c1150829a60/ovn-acl-logging/0.log" Apr 16 17:08:07.733475 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:08:07.733418 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" podUID="4b02bc96-301a-4de1-87fb-96d881dc23d4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.59:8001/health\": dial tcp 10.133.0.59:8001: connect: connection refused" Apr 16 17:08:07.753635 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:08:07.753599 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" podUID="f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.60:8000/health\": dial tcp 10.133.0.60:8000: connect: connection refused" Apr 16 17:08:08.218753 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:08:08.218701 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" podUID="8bd093de-7558-445a-af48-f9c7a9d6b76c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.61:8000/health\": dial tcp 10.133.0.61:8000: connect: connection refused" Apr 16 17:08:17.733421 ip-10-0-137-126 
kubenswrapper[2572]: I0416 17:08:17.733365 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" podUID="4b02bc96-301a-4de1-87fb-96d881dc23d4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.59:8001/health\": dial tcp 10.133.0.59:8001: connect: connection refused" Apr 16 17:08:17.753035 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:08:17.753001 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" podUID="f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.60:8000/health\": dial tcp 10.133.0.60:8000: connect: connection refused" Apr 16 17:08:18.218806 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:08:18.218757 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" podUID="8bd093de-7558-445a-af48-f9c7a9d6b76c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.61:8000/health\": dial tcp 10.133.0.61:8000: connect: connection refused" Apr 16 17:08:27.733327 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:08:27.733276 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" podUID="4b02bc96-301a-4de1-87fb-96d881dc23d4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.59:8001/health\": dial tcp 10.133.0.59:8001: connect: connection refused" Apr 16 17:08:27.753128 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:08:27.753087 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" podUID="f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2" containerName="main" probeResult="failure" output="Get 
\"https://10.133.0.60:8000/health\": dial tcp 10.133.0.60:8000: connect: connection refused" Apr 16 17:08:28.218372 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:08:28.218333 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" podUID="8bd093de-7558-445a-af48-f9c7a9d6b76c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.61:8000/health\": dial tcp 10.133.0.61:8000: connect: connection refused" Apr 16 17:08:37.733457 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:08:37.733404 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" podUID="4b02bc96-301a-4de1-87fb-96d881dc23d4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.59:8001/health\": dial tcp 10.133.0.59:8001: connect: connection refused" Apr 16 17:08:37.753670 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:08:37.753628 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" podUID="f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.60:8000/health\": dial tcp 10.133.0.60:8000: connect: connection refused" Apr 16 17:08:38.218008 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:08:38.217958 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" podUID="8bd093de-7558-445a-af48-f9c7a9d6b76c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.61:8000/health\": dial tcp 10.133.0.61:8000: connect: connection refused" Apr 16 17:08:47.733157 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:08:47.733042 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" 
podUID="4b02bc96-301a-4de1-87fb-96d881dc23d4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.59:8001/health\": dial tcp 10.133.0.59:8001: connect: connection refused" Apr 16 17:08:47.753859 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:08:47.753819 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" podUID="f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.60:8000/health\": dial tcp 10.133.0.60:8000: connect: connection refused" Apr 16 17:08:48.218748 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:08:48.218700 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" podUID="8bd093de-7558-445a-af48-f9c7a9d6b76c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.61:8000/health\": dial tcp 10.133.0.61:8000: connect: connection refused" Apr 16 17:08:57.732877 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:08:57.732829 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" podUID="4b02bc96-301a-4de1-87fb-96d881dc23d4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.59:8001/health\": dial tcp 10.133.0.59:8001: connect: connection refused" Apr 16 17:08:57.753044 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:08:57.753005 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" podUID="f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.60:8000/health\": dial tcp 10.133.0.60:8000: connect: connection refused" Apr 16 17:08:58.218272 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:08:58.218232 2572 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" podUID="8bd093de-7558-445a-af48-f9c7a9d6b76c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.61:8000/health\": dial tcp 10.133.0.61:8000: connect: connection refused" Apr 16 17:09:07.732722 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:09:07.732668 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" podUID="4b02bc96-301a-4de1-87fb-96d881dc23d4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.59:8001/health\": dial tcp 10.133.0.59:8001: connect: connection refused" Apr 16 17:09:07.752903 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:09:07.752862 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" podUID="f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.60:8000/health\": dial tcp 10.133.0.60:8000: connect: connection refused" Apr 16 17:09:08.218908 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:09:08.218855 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" podUID="8bd093de-7558-445a-af48-f9c7a9d6b76c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.61:8000/health\": dial tcp 10.133.0.61:8000: connect: connection refused" Apr 16 17:09:17.732667 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:09:17.732625 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" podUID="4b02bc96-301a-4de1-87fb-96d881dc23d4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.59:8001/health\": dial tcp 10.133.0.59:8001: connect: connection refused" Apr 16 
17:09:17.753754 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:09:17.753720 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" podUID="f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.60:8000/health\": dial tcp 10.133.0.60:8000: connect: connection refused" Apr 16 17:09:18.219032 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:09:18.218980 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" podUID="8bd093de-7558-445a-af48-f9c7a9d6b76c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.61:8000/health\": dial tcp 10.133.0.61:8000: connect: connection refused" Apr 16 17:09:27.732910 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:09:27.732861 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" podUID="4b02bc96-301a-4de1-87fb-96d881dc23d4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.59:8001/health\": dial tcp 10.133.0.59:8001: connect: connection refused" Apr 16 17:09:27.753206 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:09:27.753165 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" podUID="f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.60:8000/health\": dial tcp 10.133.0.60:8000: connect: connection refused" Apr 16 17:09:28.218612 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:09:28.218563 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" podUID="8bd093de-7558-445a-af48-f9c7a9d6b76c" containerName="main" probeResult="failure" output="Get 
\"https://10.133.0.61:8000/health\": dial tcp 10.133.0.61:8000: connect: connection refused" Apr 16 17:09:37.742050 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:09:37.742019 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" Apr 16 17:09:37.760292 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:09:37.760265 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" Apr 16 17:09:37.762862 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:09:37.762844 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" Apr 16 17:09:37.770362 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:09:37.770335 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" Apr 16 17:09:38.218394 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:09:38.218344 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" podUID="8bd093de-7558-445a-af48-f9c7a9d6b76c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.61:8000/health\": dial tcp 10.133.0.61:8000: connect: connection refused" Apr 16 17:09:48.231557 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:09:48.231521 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" Apr 16 17:09:48.238955 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:09:48.238931 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" Apr 16 
17:09:51.592278 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:09:51.592231 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l"] Apr 16 17:09:51.592842 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:09:51.592784 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" podUID="4b02bc96-301a-4de1-87fb-96d881dc23d4" containerName="main" containerID="cri-o://00cccf8ece717b1d6e597635e2c5d8f20f993347aa176a5434f362c0ce7b1cdc" gracePeriod=30 Apr 16 17:09:51.595708 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:09:51.595685 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27"] Apr 16 17:09:51.596034 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:09:51.595993 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" podUID="f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2" containerName="main" containerID="cri-o://a8cc358a12b5e42c92eeb54674ca2addd3f38b05924613b49cf0583623ab37a6" gracePeriod=30 Apr 16 17:10:19.793747 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:19.793657 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl"] Apr 16 17:10:19.806266 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:19.805475 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" Apr 16 17:10:19.807708 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:19.807682 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz"] Apr 16 17:10:19.809898 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:19.809873 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 16 17:10:19.810155 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:19.810132 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-dockercfg-vfp2t\"" Apr 16 17:10:19.811893 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:19.811871 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl"] Apr 16 17:10:19.811998 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:19.811988 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" Apr 16 17:10:19.820967 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:19.820942 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz"] Apr 16 17:10:19.941763 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:19.941720 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f13b4cc2-1661-4f97-af96-93ea0d79f1af-home\") pod \"custom-route-timeout-pd-test-kserve-855cffc847-fzvhl\" (UID: \"f13b4cc2-1661-4f97-af96-93ea0d79f1af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" Apr 16 17:10:19.941763 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:19.941768 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz\" (UID: \"23f8a34b-3c7e-4539-a8b5-43fac58d50e8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" Apr 16 17:10:19.941977 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:19.941790 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz\" (UID: \"23f8a34b-3c7e-4539-a8b5-43fac58d50e8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" Apr 16 17:10:19.941977 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:19.941849 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/f13b4cc2-1661-4f97-af96-93ea0d79f1af-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-855cffc847-fzvhl\" (UID: \"f13b4cc2-1661-4f97-af96-93ea0d79f1af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" Apr 16 17:10:19.941977 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:19.941889 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz\" (UID: \"23f8a34b-3c7e-4539-a8b5-43fac58d50e8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" Apr 16 17:10:19.941977 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:19.941915 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz\" (UID: \"23f8a34b-3c7e-4539-a8b5-43fac58d50e8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" Apr 16 17:10:19.941977 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:19.941933 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f13b4cc2-1661-4f97-af96-93ea0d79f1af-model-cache\") pod \"custom-route-timeout-pd-test-kserve-855cffc847-fzvhl\" (UID: \"f13b4cc2-1661-4f97-af96-93ea0d79f1af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" Apr 16 17:10:19.941977 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:19.941958 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnqhd\" (UniqueName: 
\"kubernetes.io/projected/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-kube-api-access-rnqhd\") pod \"custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz\" (UID: \"23f8a34b-3c7e-4539-a8b5-43fac58d50e8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" Apr 16 17:10:19.942200 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:19.941979 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h7j8\" (UniqueName: \"kubernetes.io/projected/f13b4cc2-1661-4f97-af96-93ea0d79f1af-kube-api-access-7h7j8\") pod \"custom-route-timeout-pd-test-kserve-855cffc847-fzvhl\" (UID: \"f13b4cc2-1661-4f97-af96-93ea0d79f1af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" Apr 16 17:10:19.942200 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:19.942000 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f13b4cc2-1661-4f97-af96-93ea0d79f1af-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-855cffc847-fzvhl\" (UID: \"f13b4cc2-1661-4f97-af96-93ea0d79f1af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" Apr 16 17:10:19.942200 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:19.942094 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f13b4cc2-1661-4f97-af96-93ea0d79f1af-dshm\") pod \"custom-route-timeout-pd-test-kserve-855cffc847-fzvhl\" (UID: \"f13b4cc2-1661-4f97-af96-93ea0d79f1af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" Apr 16 17:10:19.942200 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:19.942132 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz\" (UID: \"23f8a34b-3c7e-4539-a8b5-43fac58d50e8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" Apr 16 17:10:20.042968 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:20.042935 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f13b4cc2-1661-4f97-af96-93ea0d79f1af-home\") pod \"custom-route-timeout-pd-test-kserve-855cffc847-fzvhl\" (UID: \"f13b4cc2-1661-4f97-af96-93ea0d79f1af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" Apr 16 17:10:20.042968 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:20.042975 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz\" (UID: \"23f8a34b-3c7e-4539-a8b5-43fac58d50e8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" Apr 16 17:10:20.043251 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:20.042995 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz\" (UID: \"23f8a34b-3c7e-4539-a8b5-43fac58d50e8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" Apr 16 17:10:20.043251 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:20.043047 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f13b4cc2-1661-4f97-af96-93ea0d79f1af-kserve-provision-location\") pod 
\"custom-route-timeout-pd-test-kserve-855cffc847-fzvhl\" (UID: \"f13b4cc2-1661-4f97-af96-93ea0d79f1af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" Apr 16 17:10:20.043251 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:20.043138 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz\" (UID: \"23f8a34b-3c7e-4539-a8b5-43fac58d50e8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" Apr 16 17:10:20.043251 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:20.043169 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz\" (UID: \"23f8a34b-3c7e-4539-a8b5-43fac58d50e8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" Apr 16 17:10:20.043251 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:20.043190 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f13b4cc2-1661-4f97-af96-93ea0d79f1af-model-cache\") pod \"custom-route-timeout-pd-test-kserve-855cffc847-fzvhl\" (UID: \"f13b4cc2-1661-4f97-af96-93ea0d79f1af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" Apr 16 17:10:20.043251 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:20.043219 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rnqhd\" (UniqueName: \"kubernetes.io/projected/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-kube-api-access-rnqhd\") pod \"custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz\" (UID: \"23f8a34b-3c7e-4539-a8b5-43fac58d50e8\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" Apr 16 17:10:20.043579 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:20.043537 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f13b4cc2-1661-4f97-af96-93ea0d79f1af-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-855cffc847-fzvhl\" (UID: \"f13b4cc2-1661-4f97-af96-93ea0d79f1af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" Apr 16 17:10:20.043648 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:20.043573 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f13b4cc2-1661-4f97-af96-93ea0d79f1af-model-cache\") pod \"custom-route-timeout-pd-test-kserve-855cffc847-fzvhl\" (UID: \"f13b4cc2-1661-4f97-af96-93ea0d79f1af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" Apr 16 17:10:20.043648 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:20.043611 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz\" (UID: \"23f8a34b-3c7e-4539-a8b5-43fac58d50e8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" Apr 16 17:10:20.043919 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:20.043859 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f13b4cc2-1661-4f97-af96-93ea0d79f1af-home\") pod \"custom-route-timeout-pd-test-kserve-855cffc847-fzvhl\" (UID: \"f13b4cc2-1661-4f97-af96-93ea0d79f1af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" Apr 16 17:10:20.043919 ip-10-0-137-126 kubenswrapper[2572]: I0416 
17:10:20.043883 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz\" (UID: \"23f8a34b-3c7e-4539-a8b5-43fac58d50e8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" Apr 16 17:10:20.043919 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:20.043243 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7h7j8\" (UniqueName: \"kubernetes.io/projected/f13b4cc2-1661-4f97-af96-93ea0d79f1af-kube-api-access-7h7j8\") pod \"custom-route-timeout-pd-test-kserve-855cffc847-fzvhl\" (UID: \"f13b4cc2-1661-4f97-af96-93ea0d79f1af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" Apr 16 17:10:20.044166 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:20.043940 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f13b4cc2-1661-4f97-af96-93ea0d79f1af-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-855cffc847-fzvhl\" (UID: \"f13b4cc2-1661-4f97-af96-93ea0d79f1af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" Apr 16 17:10:20.044166 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:20.043998 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f13b4cc2-1661-4f97-af96-93ea0d79f1af-dshm\") pod \"custom-route-timeout-pd-test-kserve-855cffc847-fzvhl\" (UID: \"f13b4cc2-1661-4f97-af96-93ea0d79f1af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" Apr 16 17:10:20.044166 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:20.044031 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz\" (UID: \"23f8a34b-3c7e-4539-a8b5-43fac58d50e8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" Apr 16 17:10:20.044392 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:20.044369 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz\" (UID: \"23f8a34b-3c7e-4539-a8b5-43fac58d50e8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" Apr 16 17:10:20.045909 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:20.045882 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz\" (UID: \"23f8a34b-3c7e-4539-a8b5-43fac58d50e8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" Apr 16 17:10:20.046365 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:20.046340 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz\" (UID: \"23f8a34b-3c7e-4539-a8b5-43fac58d50e8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" Apr 16 17:10:20.046558 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:20.046540 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f13b4cc2-1661-4f97-af96-93ea0d79f1af-dshm\") pod 
\"custom-route-timeout-pd-test-kserve-855cffc847-fzvhl\" (UID: \"f13b4cc2-1661-4f97-af96-93ea0d79f1af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" Apr 16 17:10:20.046734 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:20.046718 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f13b4cc2-1661-4f97-af96-93ea0d79f1af-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-855cffc847-fzvhl\" (UID: \"f13b4cc2-1661-4f97-af96-93ea0d79f1af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" Apr 16 17:10:20.057421 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:20.057397 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnqhd\" (UniqueName: \"kubernetes.io/projected/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-kube-api-access-rnqhd\") pod \"custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz\" (UID: \"23f8a34b-3c7e-4539-a8b5-43fac58d50e8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" Apr 16 17:10:20.057511 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:20.057473 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h7j8\" (UniqueName: \"kubernetes.io/projected/f13b4cc2-1661-4f97-af96-93ea0d79f1af-kube-api-access-7h7j8\") pod \"custom-route-timeout-pd-test-kserve-855cffc847-fzvhl\" (UID: \"f13b4cc2-1661-4f97-af96-93ea0d79f1af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" Apr 16 17:10:20.119413 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:20.119386 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" Apr 16 17:10:20.126127 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:20.126105 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" Apr 16 17:10:20.296927 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:20.296895 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz"] Apr 16 17:10:20.298867 ip-10-0-137-126 kubenswrapper[2572]: W0416 17:10:20.298841 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23f8a34b_3c7e_4539_a8b5_43fac58d50e8.slice/crio-4eb8a13e45b2ce6112ae03a10bacc345c9c969fc7bd135478728f1873f3fae8f WatchSource:0}: Error finding container 4eb8a13e45b2ce6112ae03a10bacc345c9c969fc7bd135478728f1873f3fae8f: Status 404 returned error can't find the container with id 4eb8a13e45b2ce6112ae03a10bacc345c9c969fc7bd135478728f1873f3fae8f Apr 16 17:10:20.300765 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:20.300731 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 17:10:20.318605 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:20.318547 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl"] Apr 16 17:10:21.077707 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:21.077665 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" event={"ID":"23f8a34b-3c7e-4539-a8b5-43fac58d50e8","Type":"ContainerStarted","Data":"bd025d9680dbc07a4f89942232558bdcad056a6317b69293b6a06c2fdabd05a0"} Apr 16 17:10:21.078150 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:21.077713 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" 
event={"ID":"23f8a34b-3c7e-4539-a8b5-43fac58d50e8","Type":"ContainerStarted","Data":"4eb8a13e45b2ce6112ae03a10bacc345c9c969fc7bd135478728f1873f3fae8f"} Apr 16 17:10:21.079011 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:21.078989 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" event={"ID":"f13b4cc2-1661-4f97-af96-93ea0d79f1af","Type":"ContainerStarted","Data":"c1b5f684a53bd0e789c3ee1c07c8781a56b7185d00114897e7a11d4e605957ab"} Apr 16 17:10:21.079140 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:21.079019 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" event={"ID":"f13b4cc2-1661-4f97-af96-93ea0d79f1af","Type":"ContainerStarted","Data":"f6067c6f9bd3470932f90e0ab22c888384cc1213fd44854b8ac878e07b7ed48a"} Apr 16 17:10:21.079140 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:21.079100 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" Apr 16 17:10:21.473390 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:21.473355 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22"] Apr 16 17:10:21.473710 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:21.473686 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" podUID="8bd093de-7558-445a-af48-f9c7a9d6b76c" containerName="main" containerID="cri-o://bed2740714cecd37defd298497e295534205ebbac65d091ea3e04154c3ed41c3" gracePeriod=30 Apr 16 17:10:21.592888 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:21.592846 2572 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" podUID="4b02bc96-301a-4de1-87fb-96d881dc23d4" containerName="llm-d-routing-sidecar" containerID="cri-o://88bcf30775972ba3e9220029b2415ce1a966faefa909e3e82e2b40df4e9ee054" gracePeriod=2 Apr 16 17:10:21.928497 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:21.928466 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" Apr 16 17:10:21.988363 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:21.987478 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l_4b02bc96-301a-4de1-87fb-96d881dc23d4/main/0.log" Apr 16 17:10:21.988363 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:21.988289 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" Apr 16 17:10:22.063863 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.063838 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7c9q\" (UniqueName: \"kubernetes.io/projected/4b02bc96-301a-4de1-87fb-96d881dc23d4-kube-api-access-l7c9q\") pod \"4b02bc96-301a-4de1-87fb-96d881dc23d4\" (UID: \"4b02bc96-301a-4de1-87fb-96d881dc23d4\") " Apr 16 17:10:22.063983 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.063869 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4b02bc96-301a-4de1-87fb-96d881dc23d4-dshm\") pod \"4b02bc96-301a-4de1-87fb-96d881dc23d4\" (UID: \"4b02bc96-301a-4de1-87fb-96d881dc23d4\") " Apr 16 17:10:22.063983 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.063911 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-tls-certs\") pod \"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2\" (UID: \"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2\") " Apr 16 17:10:22.063983 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.063934 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4b02bc96-301a-4de1-87fb-96d881dc23d4-kserve-provision-location\") pod \"4b02bc96-301a-4de1-87fb-96d881dc23d4\" (UID: \"4b02bc96-301a-4de1-87fb-96d881dc23d4\") " Apr 16 17:10:22.063983 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.063974 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnftv\" (UniqueName: \"kubernetes.io/projected/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-kube-api-access-mnftv\") pod \"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2\" (UID: \"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2\") " Apr 16 17:10:22.064187 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.064018 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-dshm\") pod \"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2\" (UID: \"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2\") " Apr 16 17:10:22.064187 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.064044 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-home\") pod \"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2\" (UID: \"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2\") " Apr 16 17:10:22.064187 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.064084 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-kserve-provision-location\") pod 
\"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2\" (UID: \"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2\") " Apr 16 17:10:22.064187 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.064122 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4b02bc96-301a-4de1-87fb-96d881dc23d4-home\") pod \"4b02bc96-301a-4de1-87fb-96d881dc23d4\" (UID: \"4b02bc96-301a-4de1-87fb-96d881dc23d4\") " Apr 16 17:10:22.064187 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.064156 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-model-cache\") pod \"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2\" (UID: \"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2\") " Apr 16 17:10:22.064432 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.064194 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4b02bc96-301a-4de1-87fb-96d881dc23d4-model-cache\") pod \"4b02bc96-301a-4de1-87fb-96d881dc23d4\" (UID: \"4b02bc96-301a-4de1-87fb-96d881dc23d4\") " Apr 16 17:10:22.064432 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.064251 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4b02bc96-301a-4de1-87fb-96d881dc23d4-tls-certs\") pod \"4b02bc96-301a-4de1-87fb-96d881dc23d4\" (UID: \"4b02bc96-301a-4de1-87fb-96d881dc23d4\") " Apr 16 17:10:22.065103 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.064836 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b02bc96-301a-4de1-87fb-96d881dc23d4-home" (OuterVolumeSpecName: "home") pod "4b02bc96-301a-4de1-87fb-96d881dc23d4" (UID: "4b02bc96-301a-4de1-87fb-96d881dc23d4"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:10:22.065653 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.065414 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-home" (OuterVolumeSpecName: "home") pod "f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2" (UID: "f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:10:22.066916 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.066579 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b02bc96-301a-4de1-87fb-96d881dc23d4-kube-api-access-l7c9q" (OuterVolumeSpecName: "kube-api-access-l7c9q") pod "4b02bc96-301a-4de1-87fb-96d881dc23d4" (UID: "4b02bc96-301a-4de1-87fb-96d881dc23d4"). InnerVolumeSpecName "kube-api-access-l7c9q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:10:22.066916 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.066834 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-dshm" (OuterVolumeSpecName: "dshm") pod "f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2" (UID: "f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:10:22.066916 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.066859 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-model-cache" (OuterVolumeSpecName: "model-cache") pod "f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2" (UID: "f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:10:22.067188 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.066958 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b02bc96-301a-4de1-87fb-96d881dc23d4-dshm" (OuterVolumeSpecName: "dshm") pod "4b02bc96-301a-4de1-87fb-96d881dc23d4" (UID: "4b02bc96-301a-4de1-87fb-96d881dc23d4"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:10:22.067244 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.067180 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2" (UID: "f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:10:22.067295 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.067249 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b02bc96-301a-4de1-87fb-96d881dc23d4-model-cache" (OuterVolumeSpecName: "model-cache") pod "4b02bc96-301a-4de1-87fb-96d881dc23d4" (UID: "4b02bc96-301a-4de1-87fb-96d881dc23d4"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:10:22.067572 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.067548 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b02bc96-301a-4de1-87fb-96d881dc23d4-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "4b02bc96-301a-4de1-87fb-96d881dc23d4" (UID: "4b02bc96-301a-4de1-87fb-96d881dc23d4"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:10:22.068512 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.068493 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-kube-api-access-mnftv" (OuterVolumeSpecName: "kube-api-access-mnftv") pod "f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2" (UID: "f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2"). InnerVolumeSpecName "kube-api-access-mnftv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:10:22.080127 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.080098 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b02bc96-301a-4de1-87fb-96d881dc23d4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4b02bc96-301a-4de1-87fb-96d881dc23d4" (UID: "4b02bc96-301a-4de1-87fb-96d881dc23d4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:10:22.085361 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.085343 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l_4b02bc96-301a-4de1-87fb-96d881dc23d4/main/0.log" Apr 16 17:10:22.085999 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.085978 2572 generic.go:358] "Generic (PLEG): container finished" podID="4b02bc96-301a-4de1-87fb-96d881dc23d4" containerID="00cccf8ece717b1d6e597635e2c5d8f20f993347aa176a5434f362c0ce7b1cdc" exitCode=137 Apr 16 17:10:22.085999 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.085997 2572 generic.go:358] "Generic (PLEG): container finished" podID="4b02bc96-301a-4de1-87fb-96d881dc23d4" containerID="88bcf30775972ba3e9220029b2415ce1a966faefa909e3e82e2b40df4e9ee054" exitCode=0 Apr 16 17:10:22.086187 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.086057 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" Apr 16 17:10:22.086187 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.086080 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" event={"ID":"4b02bc96-301a-4de1-87fb-96d881dc23d4","Type":"ContainerDied","Data":"00cccf8ece717b1d6e597635e2c5d8f20f993347aa176a5434f362c0ce7b1cdc"} Apr 16 17:10:22.086187 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.086124 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" event={"ID":"4b02bc96-301a-4de1-87fb-96d881dc23d4","Type":"ContainerDied","Data":"88bcf30775972ba3e9220029b2415ce1a966faefa909e3e82e2b40df4e9ee054"} Apr 16 17:10:22.086187 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.086153 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l" event={"ID":"4b02bc96-301a-4de1-87fb-96d881dc23d4","Type":"ContainerDied","Data":"f3c2b97b8490b13fef5fb7f15dcf6d4adaffec062e8d852abb8643a911d0fa92"} Apr 16 17:10:22.086187 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.086173 2572 scope.go:117] "RemoveContainer" containerID="00cccf8ece717b1d6e597635e2c5d8f20f993347aa176a5434f362c0ce7b1cdc" Apr 16 17:10:22.088169 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.088136 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" event={"ID":"f13b4cc2-1661-4f97-af96-93ea0d79f1af","Type":"ContainerStarted","Data":"ac51caa5272ac57d431b728f12e3ee7f52f87f163884e057d52315fba17c0dff"} Apr 16 17:10:22.090536 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.090360 2572 generic.go:358] "Generic (PLEG): container finished" podID="f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2" 
containerID="a8cc358a12b5e42c92eeb54674ca2addd3f38b05924613b49cf0583623ab37a6" exitCode=137 Apr 16 17:10:22.090536 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.090432 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" Apr 16 17:10:22.090536 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.090490 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" event={"ID":"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2","Type":"ContainerDied","Data":"a8cc358a12b5e42c92eeb54674ca2addd3f38b05924613b49cf0583623ab37a6"} Apr 16 17:10:22.090536 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.090515 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27" event={"ID":"f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2","Type":"ContainerDied","Data":"de321fc8b2ac437f2e79c8d95b652bd15067c782f55af43d398830b23bdd92fb"} Apr 16 17:10:22.101271 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.101248 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2" (UID: "f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:10:22.121400 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.121375 2572 scope.go:117] "RemoveContainer" containerID="3244058ea660fc7ff8306f7c3e02902d5874d5c88f009008143e363c8090dc24" Apr 16 17:10:22.139448 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.139418 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l"] Apr 16 17:10:22.143583 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.143483 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5785784b6bjlf5l"] Apr 16 17:10:22.148663 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.148400 2572 scope.go:117] "RemoveContainer" containerID="88bcf30775972ba3e9220029b2415ce1a966faefa909e3e82e2b40df4e9ee054" Apr 16 17:10:22.160812 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.160788 2572 scope.go:117] "RemoveContainer" containerID="00cccf8ece717b1d6e597635e2c5d8f20f993347aa176a5434f362c0ce7b1cdc" Apr 16 17:10:22.161144 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:10:22.161123 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00cccf8ece717b1d6e597635e2c5d8f20f993347aa176a5434f362c0ce7b1cdc\": container with ID starting with 00cccf8ece717b1d6e597635e2c5d8f20f993347aa176a5434f362c0ce7b1cdc not found: ID does not exist" containerID="00cccf8ece717b1d6e597635e2c5d8f20f993347aa176a5434f362c0ce7b1cdc" Apr 16 17:10:22.161229 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.161157 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00cccf8ece717b1d6e597635e2c5d8f20f993347aa176a5434f362c0ce7b1cdc"} err="failed to get container status \"00cccf8ece717b1d6e597635e2c5d8f20f993347aa176a5434f362c0ce7b1cdc\": rpc error: code = NotFound desc = could not find 
container \"00cccf8ece717b1d6e597635e2c5d8f20f993347aa176a5434f362c0ce7b1cdc\": container with ID starting with 00cccf8ece717b1d6e597635e2c5d8f20f993347aa176a5434f362c0ce7b1cdc not found: ID does not exist" Apr 16 17:10:22.161229 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.161182 2572 scope.go:117] "RemoveContainer" containerID="3244058ea660fc7ff8306f7c3e02902d5874d5c88f009008143e363c8090dc24" Apr 16 17:10:22.161515 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:10:22.161488 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3244058ea660fc7ff8306f7c3e02902d5874d5c88f009008143e363c8090dc24\": container with ID starting with 3244058ea660fc7ff8306f7c3e02902d5874d5c88f009008143e363c8090dc24 not found: ID does not exist" containerID="3244058ea660fc7ff8306f7c3e02902d5874d5c88f009008143e363c8090dc24" Apr 16 17:10:22.161578 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.161528 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3244058ea660fc7ff8306f7c3e02902d5874d5c88f009008143e363c8090dc24"} err="failed to get container status \"3244058ea660fc7ff8306f7c3e02902d5874d5c88f009008143e363c8090dc24\": rpc error: code = NotFound desc = could not find container \"3244058ea660fc7ff8306f7c3e02902d5874d5c88f009008143e363c8090dc24\": container with ID starting with 3244058ea660fc7ff8306f7c3e02902d5874d5c88f009008143e363c8090dc24 not found: ID does not exist" Apr 16 17:10:22.161578 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.161552 2572 scope.go:117] "RemoveContainer" containerID="88bcf30775972ba3e9220029b2415ce1a966faefa909e3e82e2b40df4e9ee054" Apr 16 17:10:22.161842 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:10:22.161821 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88bcf30775972ba3e9220029b2415ce1a966faefa909e3e82e2b40df4e9ee054\": container with ID 
starting with 88bcf30775972ba3e9220029b2415ce1a966faefa909e3e82e2b40df4e9ee054 not found: ID does not exist" containerID="88bcf30775972ba3e9220029b2415ce1a966faefa909e3e82e2b40df4e9ee054" Apr 16 17:10:22.161898 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.161852 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88bcf30775972ba3e9220029b2415ce1a966faefa909e3e82e2b40df4e9ee054"} err="failed to get container status \"88bcf30775972ba3e9220029b2415ce1a966faefa909e3e82e2b40df4e9ee054\": rpc error: code = NotFound desc = could not find container \"88bcf30775972ba3e9220029b2415ce1a966faefa909e3e82e2b40df4e9ee054\": container with ID starting with 88bcf30775972ba3e9220029b2415ce1a966faefa909e3e82e2b40df4e9ee054 not found: ID does not exist" Apr 16 17:10:22.161898 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.161874 2572 scope.go:117] "RemoveContainer" containerID="00cccf8ece717b1d6e597635e2c5d8f20f993347aa176a5434f362c0ce7b1cdc" Apr 16 17:10:22.162165 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.162136 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00cccf8ece717b1d6e597635e2c5d8f20f993347aa176a5434f362c0ce7b1cdc"} err="failed to get container status \"00cccf8ece717b1d6e597635e2c5d8f20f993347aa176a5434f362c0ce7b1cdc\": rpc error: code = NotFound desc = could not find container \"00cccf8ece717b1d6e597635e2c5d8f20f993347aa176a5434f362c0ce7b1cdc\": container with ID starting with 00cccf8ece717b1d6e597635e2c5d8f20f993347aa176a5434f362c0ce7b1cdc not found: ID does not exist" Apr 16 17:10:22.162165 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.162163 2572 scope.go:117] "RemoveContainer" containerID="3244058ea660fc7ff8306f7c3e02902d5874d5c88f009008143e363c8090dc24" Apr 16 17:10:22.162415 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.162398 2572 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3244058ea660fc7ff8306f7c3e02902d5874d5c88f009008143e363c8090dc24"} err="failed to get container status \"3244058ea660fc7ff8306f7c3e02902d5874d5c88f009008143e363c8090dc24\": rpc error: code = NotFound desc = could not find container \"3244058ea660fc7ff8306f7c3e02902d5874d5c88f009008143e363c8090dc24\": container with ID starting with 3244058ea660fc7ff8306f7c3e02902d5874d5c88f009008143e363c8090dc24 not found: ID does not exist" Apr 16 17:10:22.162415 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.162415 2572 scope.go:117] "RemoveContainer" containerID="88bcf30775972ba3e9220029b2415ce1a966faefa909e3e82e2b40df4e9ee054" Apr 16 17:10:22.162702 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.162681 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88bcf30775972ba3e9220029b2415ce1a966faefa909e3e82e2b40df4e9ee054"} err="failed to get container status \"88bcf30775972ba3e9220029b2415ce1a966faefa909e3e82e2b40df4e9ee054\": rpc error: code = NotFound desc = could not find container \"88bcf30775972ba3e9220029b2415ce1a966faefa909e3e82e2b40df4e9ee054\": container with ID starting with 88bcf30775972ba3e9220029b2415ce1a966faefa909e3e82e2b40df4e9ee054 not found: ID does not exist" Apr 16 17:10:22.162794 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.162756 2572 scope.go:117] "RemoveContainer" containerID="a8cc358a12b5e42c92eeb54674ca2addd3f38b05924613b49cf0583623ab37a6" Apr 16 17:10:22.165171 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.165151 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l7c9q\" (UniqueName: \"kubernetes.io/projected/4b02bc96-301a-4de1-87fb-96d881dc23d4-kube-api-access-l7c9q\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:10:22.165171 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.165171 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/4b02bc96-301a-4de1-87fb-96d881dc23d4-dshm\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:10:22.165333 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.165181 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-tls-certs\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:10:22.165333 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.165190 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4b02bc96-301a-4de1-87fb-96d881dc23d4-kserve-provision-location\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:10:22.165333 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.165199 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mnftv\" (UniqueName: \"kubernetes.io/projected/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-kube-api-access-mnftv\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:10:22.165333 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.165207 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-dshm\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:10:22.165333 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.165215 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-home\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:10:22.165333 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.165222 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-kserve-provision-location\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath 
\"\"" Apr 16 17:10:22.165333 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.165230 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4b02bc96-301a-4de1-87fb-96d881dc23d4-home\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:10:22.165333 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.165243 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2-model-cache\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:10:22.165333 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.165254 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4b02bc96-301a-4de1-87fb-96d881dc23d4-model-cache\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:10:22.165333 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.165268 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4b02bc96-301a-4de1-87fb-96d881dc23d4-tls-certs\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:10:22.186868 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.186835 2572 scope.go:117] "RemoveContainer" containerID="951f9bdd21a2ee2c347816842557f4d804384fc7d84157186c87da10d9879d8b" Apr 16 17:10:22.230418 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.230394 2572 scope.go:117] "RemoveContainer" containerID="a8cc358a12b5e42c92eeb54674ca2addd3f38b05924613b49cf0583623ab37a6" Apr 16 17:10:22.230764 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:10:22.230730 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8cc358a12b5e42c92eeb54674ca2addd3f38b05924613b49cf0583623ab37a6\": container with ID starting with a8cc358a12b5e42c92eeb54674ca2addd3f38b05924613b49cf0583623ab37a6 not found: ID does 
not exist" containerID="a8cc358a12b5e42c92eeb54674ca2addd3f38b05924613b49cf0583623ab37a6" Apr 16 17:10:22.230878 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.230767 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8cc358a12b5e42c92eeb54674ca2addd3f38b05924613b49cf0583623ab37a6"} err="failed to get container status \"a8cc358a12b5e42c92eeb54674ca2addd3f38b05924613b49cf0583623ab37a6\": rpc error: code = NotFound desc = could not find container \"a8cc358a12b5e42c92eeb54674ca2addd3f38b05924613b49cf0583623ab37a6\": container with ID starting with a8cc358a12b5e42c92eeb54674ca2addd3f38b05924613b49cf0583623ab37a6 not found: ID does not exist" Apr 16 17:10:22.230878 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.230793 2572 scope.go:117] "RemoveContainer" containerID="951f9bdd21a2ee2c347816842557f4d804384fc7d84157186c87da10d9879d8b" Apr 16 17:10:22.231169 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:10:22.231138 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"951f9bdd21a2ee2c347816842557f4d804384fc7d84157186c87da10d9879d8b\": container with ID starting with 951f9bdd21a2ee2c347816842557f4d804384fc7d84157186c87da10d9879d8b not found: ID does not exist" containerID="951f9bdd21a2ee2c347816842557f4d804384fc7d84157186c87da10d9879d8b" Apr 16 17:10:22.231268 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.231170 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"951f9bdd21a2ee2c347816842557f4d804384fc7d84157186c87da10d9879d8b"} err="failed to get container status \"951f9bdd21a2ee2c347816842557f4d804384fc7d84157186c87da10d9879d8b\": rpc error: code = NotFound desc = could not find container \"951f9bdd21a2ee2c347816842557f4d804384fc7d84157186c87da10d9879d8b\": container with ID starting with 951f9bdd21a2ee2c347816842557f4d804384fc7d84157186c87da10d9879d8b not found: ID does not exist" Apr 
16 17:10:22.423439 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.423405 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27"] Apr 16 17:10:22.428549 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:22.428522 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8fjng27"] Apr 16 17:10:23.596578 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:23.596533 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b02bc96-301a-4de1-87fb-96d881dc23d4" path="/var/lib/kubelet/pods/4b02bc96-301a-4de1-87fb-96d881dc23d4/volumes" Apr 16 17:10:23.597113 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:23.597093 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2" path="/var/lib/kubelet/pods/f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2/volumes" Apr 16 17:10:25.110128 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:25.110094 2572 generic.go:358] "Generic (PLEG): container finished" podID="23f8a34b-3c7e-4539-a8b5-43fac58d50e8" containerID="bd025d9680dbc07a4f89942232558bdcad056a6317b69293b6a06c2fdabd05a0" exitCode=0 Apr 16 17:10:25.110593 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:25.110168 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" event={"ID":"23f8a34b-3c7e-4539-a8b5-43fac58d50e8","Type":"ContainerDied","Data":"bd025d9680dbc07a4f89942232558bdcad056a6317b69293b6a06c2fdabd05a0"} Apr 16 17:10:26.115344 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:26.115306 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" 
event={"ID":"23f8a34b-3c7e-4539-a8b5-43fac58d50e8","Type":"ContainerStarted","Data":"aa2136223c4ccddf7be5dee8f240fae96336530a2f300f1ca4e1a92fc3eac442"} Apr 16 17:10:26.116848 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:26.116822 2572 generic.go:358] "Generic (PLEG): container finished" podID="f13b4cc2-1661-4f97-af96-93ea0d79f1af" containerID="ac51caa5272ac57d431b728f12e3ee7f52f87f163884e057d52315fba17c0dff" exitCode=0 Apr 16 17:10:26.116973 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:26.116868 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" event={"ID":"f13b4cc2-1661-4f97-af96-93ea0d79f1af","Type":"ContainerDied","Data":"ac51caa5272ac57d431b728f12e3ee7f52f87f163884e057d52315fba17c0dff"} Apr 16 17:10:26.139296 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:26.139246 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" podStartSLOduration=7.139228796 podStartE2EDuration="7.139228796s" podCreationTimestamp="2026-04-16 17:10:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:10:26.135167546 +0000 UTC m=+1343.303878908" watchObservedRunningTime="2026-04-16 17:10:26.139228796 +0000 UTC m=+1343.307940158" Apr 16 17:10:27.123200 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:27.123155 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" event={"ID":"f13b4cc2-1661-4f97-af96-93ea0d79f1af","Type":"ContainerStarted","Data":"a3109ff6efa1d49a58e1f593716eef087ddde08ade83dc76ffbd7630f76947a1"} Apr 16 17:10:27.151167 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:27.151105 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" podStartSLOduration=8.151087637 podStartE2EDuration="8.151087637s" podCreationTimestamp="2026-04-16 17:10:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:10:27.146715016 +0000 UTC m=+1344.315426379" watchObservedRunningTime="2026-04-16 17:10:27.151087637 +0000 UTC m=+1344.319799000" Apr 16 17:10:30.120049 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:30.120020 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" Apr 16 17:10:30.120515 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:30.120079 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" Apr 16 17:10:30.121634 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:30.121586 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" podUID="f13b4cc2-1661-4f97-af96-93ea0d79f1af" containerName="main" probeResult="failure" output="Get \"https://10.133.0.62:8001/health\": dial tcp 10.133.0.62:8001: connect: connection refused" Apr 16 17:10:30.126907 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:30.126882 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" Apr 16 17:10:30.127048 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:30.126922 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" Apr 16 17:10:30.128233 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:30.128202 2572 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" podUID="23f8a34b-3c7e-4539-a8b5-43fac58d50e8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.63:8000/health\": dial tcp 10.133.0.63:8000: connect: connection refused" Apr 16 17:10:30.133106 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:30.133088 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" Apr 16 17:10:35.268867 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.268832 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 17:10:35.269272 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.269242 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2" containerName="storage-initializer" Apr 16 17:10:35.269272 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.269255 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2" containerName="storage-initializer" Apr 16 17:10:35.269347 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.269272 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2" containerName="main" Apr 16 17:10:35.269347 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.269281 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2" containerName="main" Apr 16 17:10:35.269347 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.269295 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b02bc96-301a-4de1-87fb-96d881dc23d4" containerName="llm-d-routing-sidecar" Apr 16 17:10:35.269347 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.269302 2572 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4b02bc96-301a-4de1-87fb-96d881dc23d4" containerName="llm-d-routing-sidecar" Apr 16 17:10:35.269347 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.269312 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b02bc96-301a-4de1-87fb-96d881dc23d4" containerName="main" Apr 16 17:10:35.269347 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.269317 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b02bc96-301a-4de1-87fb-96d881dc23d4" containerName="main" Apr 16 17:10:35.269347 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.269324 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b02bc96-301a-4de1-87fb-96d881dc23d4" containerName="storage-initializer" Apr 16 17:10:35.269347 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.269330 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b02bc96-301a-4de1-87fb-96d881dc23d4" containerName="storage-initializer" Apr 16 17:10:35.269595 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.269387 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f161f8a8-c5d6-497d-bb8b-c1b73ceb89e2" containerName="main" Apr 16 17:10:35.269595 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.269399 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="4b02bc96-301a-4de1-87fb-96d881dc23d4" containerName="llm-d-routing-sidecar" Apr 16 17:10:35.269595 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.269406 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="4b02bc96-301a-4de1-87fb-96d881dc23d4" containerName="main" Apr 16 17:10:35.305864 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.305832 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 17:10:35.306026 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.305985 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 17:10:35.308909 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.308888 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 16 17:10:35.309030 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.308929 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-rv5zk\"" Apr 16 17:10:35.387759 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.387725 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"fbcd0ec2-e2ae-417c-aba4-086eb9fba102\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 17:10:35.387966 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.387775 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8gqr\" (UniqueName: \"kubernetes.io/projected/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-kube-api-access-j8gqr\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"fbcd0ec2-e2ae-417c-aba4-086eb9fba102\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 17:10:35.387966 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.387830 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: 
\"fbcd0ec2-e2ae-417c-aba4-086eb9fba102\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 17:10:35.387966 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.387875 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"fbcd0ec2-e2ae-417c-aba4-086eb9fba102\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 17:10:35.388161 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.387976 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"fbcd0ec2-e2ae-417c-aba4-086eb9fba102\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 17:10:35.388161 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.388030 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"fbcd0ec2-e2ae-417c-aba4-086eb9fba102\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 17:10:35.488915 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.488888 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"fbcd0ec2-e2ae-417c-aba4-086eb9fba102\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 17:10:35.489086 
ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.488935 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"fbcd0ec2-e2ae-417c-aba4-086eb9fba102\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 17:10:35.489086 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.488964 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"fbcd0ec2-e2ae-417c-aba4-086eb9fba102\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 17:10:35.489086 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.488990 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8gqr\" (UniqueName: \"kubernetes.io/projected/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-kube-api-access-j8gqr\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"fbcd0ec2-e2ae-417c-aba4-086eb9fba102\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 17:10:35.489086 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.489029 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"fbcd0ec2-e2ae-417c-aba4-086eb9fba102\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 17:10:35.489086 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.489054 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"fbcd0ec2-e2ae-417c-aba4-086eb9fba102\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 17:10:35.489363 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.489274 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"fbcd0ec2-e2ae-417c-aba4-086eb9fba102\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 17:10:35.489552 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.489531 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"fbcd0ec2-e2ae-417c-aba4-086eb9fba102\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 17:10:35.489688 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.489584 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"fbcd0ec2-e2ae-417c-aba4-086eb9fba102\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 17:10:35.491380 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.491353 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: 
\"fbcd0ec2-e2ae-417c-aba4-086eb9fba102\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 17:10:35.491789 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.491768 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"fbcd0ec2-e2ae-417c-aba4-086eb9fba102\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 17:10:35.497692 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.497665 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8gqr\" (UniqueName: \"kubernetes.io/projected/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-kube-api-access-j8gqr\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"fbcd0ec2-e2ae-417c-aba4-086eb9fba102\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 17:10:35.618710 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.618673 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 17:10:35.793243 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:35.793196 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 17:10:36.168108 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:36.167997 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"fbcd0ec2-e2ae-417c-aba4-086eb9fba102","Type":"ContainerStarted","Data":"0e006b28974ad8ace9b8151a95003930bdfee5f48519a4e0c8db1362fe5c4458"} Apr 16 17:10:36.168108 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:36.168039 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"fbcd0ec2-e2ae-417c-aba4-086eb9fba102","Type":"ContainerStarted","Data":"a252f53e77e144a8cf46a0245b964f94c2e4d40785424253cc3b29c632a5fa06"} Apr 16 17:10:40.120225 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:40.120181 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" podUID="f13b4cc2-1661-4f97-af96-93ea0d79f1af" containerName="main" probeResult="failure" output="Get \"https://10.133.0.62:8001/health\": dial tcp 10.133.0.62:8001: connect: connection refused" Apr 16 17:10:40.127395 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:40.127360 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" podUID="23f8a34b-3c7e-4539-a8b5-43fac58d50e8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.63:8000/health\": dial tcp 10.133.0.63:8000: connect: connection refused" Apr 16 17:10:40.229741 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:40.229710 2572 generic.go:358] "Generic (PLEG): 
container finished" podID="fbcd0ec2-e2ae-417c-aba4-086eb9fba102" containerID="0e006b28974ad8ace9b8151a95003930bdfee5f48519a4e0c8db1362fe5c4458" exitCode=0 Apr 16 17:10:40.229904 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:40.229790 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"fbcd0ec2-e2ae-417c-aba4-086eb9fba102","Type":"ContainerDied","Data":"0e006b28974ad8ace9b8151a95003930bdfee5f48519a4e0c8db1362fe5c4458"} Apr 16 17:10:41.236367 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:41.236325 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"fbcd0ec2-e2ae-417c-aba4-086eb9fba102","Type":"ContainerStarted","Data":"866d18da5952d8da6cec082bd75e88c2f598bcedc24f7c323eeee634d86f4454"} Apr 16 17:10:41.259163 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:41.259102 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podStartSLOduration=6.259058442 podStartE2EDuration="6.259058442s" podCreationTimestamp="2026-04-16 17:10:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:10:41.257276219 +0000 UTC m=+1358.425987582" watchObservedRunningTime="2026-04-16 17:10:41.259058442 +0000 UTC m=+1358.427769807" Apr 16 17:10:45.619696 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:45.619657 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 17:10:45.621655 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:45.621627 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" 
podUID="fbcd0ec2-e2ae-417c-aba4-086eb9fba102" containerName="main" probeResult="failure" output="Get \"https://10.133.0.64:8000/health\": dial tcp 10.133.0.64:8000: connect: connection refused" Apr 16 17:10:50.120296 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:50.120215 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" podUID="f13b4cc2-1661-4f97-af96-93ea0d79f1af" containerName="main" probeResult="failure" output="Get \"https://10.133.0.62:8001/health\": dial tcp 10.133.0.62:8001: connect: connection refused" Apr 16 17:10:50.127139 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:50.127101 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" podUID="23f8a34b-3c7e-4539-a8b5-43fac58d50e8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.63:8000/health\": dial tcp 10.133.0.63:8000: connect: connection refused" Apr 16 17:10:51.883583 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:51.883556 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22_8bd093de-7558-445a-af48-f9c7a9d6b76c/main/0.log" Apr 16 17:10:51.884033 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:51.884016 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" Apr 16 17:10:51.945016 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:51.944990 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8bd093de-7558-445a-af48-f9c7a9d6b76c-model-cache\") pod \"8bd093de-7558-445a-af48-f9c7a9d6b76c\" (UID: \"8bd093de-7558-445a-af48-f9c7a9d6b76c\") " Apr 16 17:10:51.945197 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:51.945040 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bd093de-7558-445a-af48-f9c7a9d6b76c-kserve-provision-location\") pod \"8bd093de-7558-445a-af48-f9c7a9d6b76c\" (UID: \"8bd093de-7558-445a-af48-f9c7a9d6b76c\") " Apr 16 17:10:51.945197 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:51.945079 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8bd093de-7558-445a-af48-f9c7a9d6b76c-tls-certs\") pod \"8bd093de-7558-445a-af48-f9c7a9d6b76c\" (UID: \"8bd093de-7558-445a-af48-f9c7a9d6b76c\") " Apr 16 17:10:51.945197 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:51.945126 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8bd093de-7558-445a-af48-f9c7a9d6b76c-dshm\") pod \"8bd093de-7558-445a-af48-f9c7a9d6b76c\" (UID: \"8bd093de-7558-445a-af48-f9c7a9d6b76c\") " Apr 16 17:10:51.945197 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:51.945150 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8bd093de-7558-445a-af48-f9c7a9d6b76c-home\") pod \"8bd093de-7558-445a-af48-f9c7a9d6b76c\" (UID: \"8bd093de-7558-445a-af48-f9c7a9d6b76c\") " Apr 16 17:10:51.945446 ip-10-0-137-126 
kubenswrapper[2572]: I0416 17:10:51.945182 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qm68x\" (UniqueName: \"kubernetes.io/projected/8bd093de-7558-445a-af48-f9c7a9d6b76c-kube-api-access-qm68x\") pod \"8bd093de-7558-445a-af48-f9c7a9d6b76c\" (UID: \"8bd093de-7558-445a-af48-f9c7a9d6b76c\") " Apr 16 17:10:51.945446 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:51.945310 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bd093de-7558-445a-af48-f9c7a9d6b76c-model-cache" (OuterVolumeSpecName: "model-cache") pod "8bd093de-7558-445a-af48-f9c7a9d6b76c" (UID: "8bd093de-7558-445a-af48-f9c7a9d6b76c"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:10:51.945678 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:51.945634 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8bd093de-7558-445a-af48-f9c7a9d6b76c-model-cache\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:10:51.946215 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:51.946184 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bd093de-7558-445a-af48-f9c7a9d6b76c-home" (OuterVolumeSpecName: "home") pod "8bd093de-7558-445a-af48-f9c7a9d6b76c" (UID: "8bd093de-7558-445a-af48-f9c7a9d6b76c"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:10:51.947730 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:51.947684 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bd093de-7558-445a-af48-f9c7a9d6b76c-kube-api-access-qm68x" (OuterVolumeSpecName: "kube-api-access-qm68x") pod "8bd093de-7558-445a-af48-f9c7a9d6b76c" (UID: "8bd093de-7558-445a-af48-f9c7a9d6b76c"). InnerVolumeSpecName "kube-api-access-qm68x". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:10:51.947939 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:51.947906 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bd093de-7558-445a-af48-f9c7a9d6b76c-dshm" (OuterVolumeSpecName: "dshm") pod "8bd093de-7558-445a-af48-f9c7a9d6b76c" (UID: "8bd093de-7558-445a-af48-f9c7a9d6b76c"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:10:51.948222 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:51.948191 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bd093de-7558-445a-af48-f9c7a9d6b76c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8bd093de-7558-445a-af48-f9c7a9d6b76c" (UID: "8bd093de-7558-445a-af48-f9c7a9d6b76c"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:10:51.985653 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:51.985605 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bd093de-7558-445a-af48-f9c7a9d6b76c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8bd093de-7558-445a-af48-f9c7a9d6b76c" (UID: "8bd093de-7558-445a-af48-f9c7a9d6b76c"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:10:52.046763 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:52.046673 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bd093de-7558-445a-af48-f9c7a9d6b76c-kserve-provision-location\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:10:52.046763 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:52.046718 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8bd093de-7558-445a-af48-f9c7a9d6b76c-tls-certs\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:10:52.046763 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:52.046734 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8bd093de-7558-445a-af48-f9c7a9d6b76c-dshm\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:10:52.046763 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:52.046748 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8bd093de-7558-445a-af48-f9c7a9d6b76c-home\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:10:52.046763 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:52.046762 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qm68x\" (UniqueName: \"kubernetes.io/projected/8bd093de-7558-445a-af48-f9c7a9d6b76c-kube-api-access-qm68x\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:10:52.295978 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:52.295944 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22_8bd093de-7558-445a-af48-f9c7a9d6b76c/main/0.log" Apr 16 17:10:52.296417 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:52.296387 2572 generic.go:358] "Generic (PLEG): 
container finished" podID="8bd093de-7558-445a-af48-f9c7a9d6b76c" containerID="bed2740714cecd37defd298497e295534205ebbac65d091ea3e04154c3ed41c3" exitCode=137 Apr 16 17:10:52.296532 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:52.296475 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" event={"ID":"8bd093de-7558-445a-af48-f9c7a9d6b76c","Type":"ContainerDied","Data":"bed2740714cecd37defd298497e295534205ebbac65d091ea3e04154c3ed41c3"} Apr 16 17:10:52.296532 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:52.296488 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" Apr 16 17:10:52.296532 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:52.296521 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22" event={"ID":"8bd093de-7558-445a-af48-f9c7a9d6b76c","Type":"ContainerDied","Data":"881e15999b4f4fe32eb186cdee45f548bb0fece2f3ac409a2db1bd22abf0df22"} Apr 16 17:10:52.296685 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:52.296543 2572 scope.go:117] "RemoveContainer" containerID="bed2740714cecd37defd298497e295534205ebbac65d091ea3e04154c3ed41c3" Apr 16 17:10:52.320007 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:52.319981 2572 scope.go:117] "RemoveContainer" containerID="c12c05bb868d552399183847c74f02dc747d785c6e5ac103136b72f7fb1b41da" Apr 16 17:10:52.327834 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:52.327797 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22"] Apr 16 17:10:52.338732 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:52.338691 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-848f6f45d5fzz22"] Apr 16 17:10:52.358698 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:52.358674 2572 scope.go:117] "RemoveContainer" containerID="bed2740714cecd37defd298497e295534205ebbac65d091ea3e04154c3ed41c3" Apr 16 17:10:52.358994 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:10:52.358973 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bed2740714cecd37defd298497e295534205ebbac65d091ea3e04154c3ed41c3\": container with ID starting with bed2740714cecd37defd298497e295534205ebbac65d091ea3e04154c3ed41c3 not found: ID does not exist" containerID="bed2740714cecd37defd298497e295534205ebbac65d091ea3e04154c3ed41c3" Apr 16 17:10:52.359114 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:52.359003 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bed2740714cecd37defd298497e295534205ebbac65d091ea3e04154c3ed41c3"} err="failed to get container status \"bed2740714cecd37defd298497e295534205ebbac65d091ea3e04154c3ed41c3\": rpc error: code = NotFound desc = could not find container \"bed2740714cecd37defd298497e295534205ebbac65d091ea3e04154c3ed41c3\": container with ID starting with bed2740714cecd37defd298497e295534205ebbac65d091ea3e04154c3ed41c3 not found: ID does not exist" Apr 16 17:10:52.359114 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:52.359024 2572 scope.go:117] "RemoveContainer" containerID="c12c05bb868d552399183847c74f02dc747d785c6e5ac103136b72f7fb1b41da" Apr 16 17:10:52.359307 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:10:52.359290 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c12c05bb868d552399183847c74f02dc747d785c6e5ac103136b72f7fb1b41da\": container with ID starting with c12c05bb868d552399183847c74f02dc747d785c6e5ac103136b72f7fb1b41da not found: ID does not exist" 
containerID="c12c05bb868d552399183847c74f02dc747d785c6e5ac103136b72f7fb1b41da" Apr 16 17:10:52.359379 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:52.359310 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c12c05bb868d552399183847c74f02dc747d785c6e5ac103136b72f7fb1b41da"} err="failed to get container status \"c12c05bb868d552399183847c74f02dc747d785c6e5ac103136b72f7fb1b41da\": rpc error: code = NotFound desc = could not find container \"c12c05bb868d552399183847c74f02dc747d785c6e5ac103136b72f7fb1b41da\": container with ID starting with c12c05bb868d552399183847c74f02dc747d785c6e5ac103136b72f7fb1b41da not found: ID does not exist" Apr 16 17:10:53.591575 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:53.591538 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bd093de-7558-445a-af48-f9c7a9d6b76c" path="/var/lib/kubelet/pods/8bd093de-7558-445a-af48-f9c7a9d6b76c/volumes" Apr 16 17:10:55.619407 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:10:55.619361 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="fbcd0ec2-e2ae-417c-aba4-086eb9fba102" containerName="main" probeResult="failure" output="Get \"https://10.133.0.64:8000/health\": dial tcp 10.133.0.64:8000: connect: connection refused" Apr 16 17:11:00.120291 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:11:00.120247 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" podUID="f13b4cc2-1661-4f97-af96-93ea0d79f1af" containerName="main" probeResult="failure" output="Get \"https://10.133.0.62:8001/health\": dial tcp 10.133.0.62:8001: connect: connection refused" Apr 16 17:11:00.126915 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:11:00.126868 2572 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" podUID="23f8a34b-3c7e-4539-a8b5-43fac58d50e8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.63:8000/health\": dial tcp 10.133.0.63:8000: connect: connection refused" Apr 16 17:11:05.619487 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:11:05.619441 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 17:11:05.620223 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:11:05.620183 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="fbcd0ec2-e2ae-417c-aba4-086eb9fba102" containerName="main" probeResult="failure" output="Get \"https://10.133.0.64:8000/health\": dial tcp 10.133.0.64:8000: connect: connection refused" Apr 16 17:11:10.120512 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:11:10.120467 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" podUID="f13b4cc2-1661-4f97-af96-93ea0d79f1af" containerName="main" probeResult="failure" output="Get \"https://10.133.0.62:8001/health\": dial tcp 10.133.0.62:8001: connect: connection refused" Apr 16 17:11:10.127357 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:11:10.127310 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" podUID="23f8a34b-3c7e-4539-a8b5-43fac58d50e8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.63:8000/health\": dial tcp 10.133.0.63:8000: connect: connection refused" Apr 16 17:11:15.619584 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:11:15.619527 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" 
podUID="fbcd0ec2-e2ae-417c-aba4-086eb9fba102" containerName="main" probeResult="failure" output="Get \"https://10.133.0.64:8000/health\": dial tcp 10.133.0.64:8000: connect: connection refused" Apr 16 17:11:20.120020 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:11:20.119969 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" podUID="f13b4cc2-1661-4f97-af96-93ea0d79f1af" containerName="main" probeResult="failure" output="Get \"https://10.133.0.62:8001/health\": dial tcp 10.133.0.62:8001: connect: connection refused" Apr 16 17:11:20.126924 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:11:20.126884 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" podUID="23f8a34b-3c7e-4539-a8b5-43fac58d50e8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.63:8000/health\": dial tcp 10.133.0.63:8000: connect: connection refused" Apr 16 17:11:25.619050 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:11:25.619002 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="fbcd0ec2-e2ae-417c-aba4-086eb9fba102" containerName="main" probeResult="failure" output="Get \"https://10.133.0.64:8000/health\": dial tcp 10.133.0.64:8000: connect: connection refused" Apr 16 17:11:30.120024 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:11:30.119980 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" podUID="f13b4cc2-1661-4f97-af96-93ea0d79f1af" containerName="main" probeResult="failure" output="Get \"https://10.133.0.62:8001/health\": dial tcp 10.133.0.62:8001: connect: connection refused" Apr 16 17:11:30.127106 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:11:30.127057 2572 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" podUID="23f8a34b-3c7e-4539-a8b5-43fac58d50e8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.63:8000/health\": dial tcp 10.133.0.63:8000: connect: connection refused" Apr 16 17:11:35.619485 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:11:35.619435 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="fbcd0ec2-e2ae-417c-aba4-086eb9fba102" containerName="main" probeResult="failure" output="Get \"https://10.133.0.64:8000/health\": dial tcp 10.133.0.64:8000: connect: connection refused" Apr 16 17:11:40.119828 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:11:40.119791 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" podUID="f13b4cc2-1661-4f97-af96-93ea0d79f1af" containerName="main" probeResult="failure" output="Get \"https://10.133.0.62:8001/health\": dial tcp 10.133.0.62:8001: connect: connection refused" Apr 16 17:11:40.126978 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:11:40.126947 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" podUID="23f8a34b-3c7e-4539-a8b5-43fac58d50e8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.63:8000/health\": dial tcp 10.133.0.63:8000: connect: connection refused" Apr 16 17:11:45.619611 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:11:45.619566 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="fbcd0ec2-e2ae-417c-aba4-086eb9fba102" containerName="main" probeResult="failure" output="Get \"https://10.133.0.64:8000/health\": dial tcp 10.133.0.64:8000: connect: connection refused" Apr 16 17:11:50.120056 ip-10-0-137-126 kubenswrapper[2572]: I0416 
17:11:50.120002 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" podUID="f13b4cc2-1661-4f97-af96-93ea0d79f1af" containerName="main" probeResult="failure" output="Get \"https://10.133.0.62:8001/health\": dial tcp 10.133.0.62:8001: connect: connection refused" Apr 16 17:11:50.126926 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:11:50.126889 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" podUID="23f8a34b-3c7e-4539-a8b5-43fac58d50e8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.63:8000/health\": dial tcp 10.133.0.63:8000: connect: connection refused" Apr 16 17:11:55.620022 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:11:55.619983 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="fbcd0ec2-e2ae-417c-aba4-086eb9fba102" containerName="main" probeResult="failure" output="Get \"https://10.133.0.64:8000/health\": dial tcp 10.133.0.64:8000: connect: connection refused" Apr 16 17:12:00.120281 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:12:00.120242 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" podUID="f13b4cc2-1661-4f97-af96-93ea0d79f1af" containerName="main" probeResult="failure" output="Get \"https://10.133.0.62:8001/health\": dial tcp 10.133.0.62:8001: connect: connection refused" Apr 16 17:12:00.126725 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:12:00.126694 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" podUID="23f8a34b-3c7e-4539-a8b5-43fac58d50e8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.63:8000/health\": dial tcp 10.133.0.63:8000: connect: connection refused" Apr 
16 17:12:05.619389 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:12:05.619347 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="fbcd0ec2-e2ae-417c-aba4-086eb9fba102" containerName="main" probeResult="failure" output="Get \"https://10.133.0.64:8000/health\": dial tcp 10.133.0.64:8000: connect: connection refused" Apr 16 17:12:10.120523 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:12:10.120474 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" podUID="f13b4cc2-1661-4f97-af96-93ea0d79f1af" containerName="main" probeResult="failure" output="Get \"https://10.133.0.62:8001/health\": dial tcp 10.133.0.62:8001: connect: connection refused" Apr 16 17:12:10.126503 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:12:10.126473 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" podUID="23f8a34b-3c7e-4539-a8b5-43fac58d50e8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.63:8000/health\": dial tcp 10.133.0.63:8000: connect: connection refused" Apr 16 17:12:15.619490 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:12:15.619443 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="fbcd0ec2-e2ae-417c-aba4-086eb9fba102" containerName="main" probeResult="failure" output="Get \"https://10.133.0.64:8000/health\": dial tcp 10.133.0.64:8000: connect: connection refused" Apr 16 17:12:20.120443 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:12:20.120391 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" podUID="f13b4cc2-1661-4f97-af96-93ea0d79f1af" containerName="main" probeResult="failure" output="Get \"https://10.133.0.62:8001/health\": dial 
tcp 10.133.0.62:8001: connect: connection refused" Apr 16 17:12:20.126828 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:12:20.126796 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" podUID="23f8a34b-3c7e-4539-a8b5-43fac58d50e8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.63:8000/health\": dial tcp 10.133.0.63:8000: connect: connection refused" Apr 16 17:12:25.619782 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:12:25.619739 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="fbcd0ec2-e2ae-417c-aba4-086eb9fba102" containerName="main" probeResult="failure" output="Get \"https://10.133.0.64:8000/health\": dial tcp 10.133.0.64:8000: connect: connection refused" Apr 16 17:12:30.119891 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:12:30.119847 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" podUID="f13b4cc2-1661-4f97-af96-93ea0d79f1af" containerName="main" probeResult="failure" output="Get \"https://10.133.0.62:8001/health\": dial tcp 10.133.0.62:8001: connect: connection refused" Apr 16 17:12:30.127232 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:12:30.127201 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" podUID="23f8a34b-3c7e-4539-a8b5-43fac58d50e8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.63:8000/health\": dial tcp 10.133.0.63:8000: connect: connection refused" Apr 16 17:12:35.619299 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:12:35.619258 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="fbcd0ec2-e2ae-417c-aba4-086eb9fba102" containerName="main" 
probeResult="failure" output="Get \"https://10.133.0.64:8000/health\": dial tcp 10.133.0.64:8000: connect: connection refused" Apr 16 17:12:40.119913 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:12:40.119870 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" podUID="f13b4cc2-1661-4f97-af96-93ea0d79f1af" containerName="main" probeResult="failure" output="Get \"https://10.133.0.62:8001/health\": dial tcp 10.133.0.62:8001: connect: connection refused" Apr 16 17:12:40.126427 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:12:40.126395 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" podUID="23f8a34b-3c7e-4539-a8b5-43fac58d50e8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.63:8000/health\": dial tcp 10.133.0.63:8000: connect: connection refused" Apr 16 17:12:45.619844 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:12:45.619804 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="fbcd0ec2-e2ae-417c-aba4-086eb9fba102" containerName="main" probeResult="failure" output="Get \"https://10.133.0.64:8000/health\": dial tcp 10.133.0.64:8000: connect: connection refused" Apr 16 17:12:50.120449 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:12:50.120387 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" podUID="f13b4cc2-1661-4f97-af96-93ea0d79f1af" containerName="main" probeResult="failure" output="Get \"https://10.133.0.62:8001/health\": dial tcp 10.133.0.62:8001: connect: connection refused" Apr 16 17:12:50.126464 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:12:50.126427 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" 
podUID="23f8a34b-3c7e-4539-a8b5-43fac58d50e8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.63:8000/health\": dial tcp 10.133.0.63:8000: connect: connection refused" Apr 16 17:12:55.619350 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:12:55.619306 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="fbcd0ec2-e2ae-417c-aba4-086eb9fba102" containerName="main" probeResult="failure" output="Get \"https://10.133.0.64:8000/health\": dial tcp 10.133.0.64:8000: connect: connection refused" Apr 16 17:13:00.119965 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:00.119921 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" podUID="f13b4cc2-1661-4f97-af96-93ea0d79f1af" containerName="main" probeResult="failure" output="Get \"https://10.133.0.62:8001/health\": dial tcp 10.133.0.62:8001: connect: connection refused" Apr 16 17:13:00.127141 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:00.127107 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" podUID="23f8a34b-3c7e-4539-a8b5-43fac58d50e8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.63:8000/health\": dial tcp 10.133.0.63:8000: connect: connection refused" Apr 16 17:13:03.605140 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:03.605110 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brhp4_c0c5c0a0-29b2-4743-af7a-0c1150829a60/ovn-acl-logging/0.log" Apr 16 17:13:03.608575 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:03.608557 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brhp4_c0c5c0a0-29b2-4743-af7a-0c1150829a60/ovn-acl-logging/0.log" Apr 16 17:13:05.619113 ip-10-0-137-126 kubenswrapper[2572]: I0416 
17:13:05.619058 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="fbcd0ec2-e2ae-417c-aba4-086eb9fba102" containerName="main" probeResult="failure" output="Get \"https://10.133.0.64:8000/health\": dial tcp 10.133.0.64:8000: connect: connection refused" Apr 16 17:13:10.135209 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:10.135178 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" Apr 16 17:13:10.141868 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:10.141839 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" Apr 16 17:13:10.153226 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:10.153206 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" Apr 16 17:13:10.155470 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:10.155452 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" Apr 16 17:13:15.628845 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:15.628814 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 17:13:15.637401 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:15.637367 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 17:13:24.878940 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:24.878902 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz"] Apr 16 
17:13:24.879943 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:24.879873 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" podUID="23f8a34b-3c7e-4539-a8b5-43fac58d50e8" containerName="main" containerID="cri-o://aa2136223c4ccddf7be5dee8f240fae96336530a2f300f1ca4e1a92fc3eac442" gracePeriod=30 Apr 16 17:13:24.888548 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:24.888412 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl"] Apr 16 17:13:24.888957 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:24.888875 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" podUID="f13b4cc2-1661-4f97-af96-93ea0d79f1af" containerName="main" containerID="cri-o://a3109ff6efa1d49a58e1f593716eef087ddde08ade83dc76ffbd7630f76947a1" gracePeriod=30 Apr 16 17:13:32.916047 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:32.916014 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld"] Apr 16 17:13:32.917476 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:32.917447 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bd093de-7558-445a-af48-f9c7a9d6b76c" containerName="main" Apr 16 17:13:32.917476 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:32.917477 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bd093de-7558-445a-af48-f9c7a9d6b76c" containerName="main" Apr 16 17:13:32.917636 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:32.917517 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bd093de-7558-445a-af48-f9c7a9d6b76c" containerName="storage-initializer" Apr 16 17:13:32.917636 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:32.917527 2572 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8bd093de-7558-445a-af48-f9c7a9d6b76c" containerName="storage-initializer" Apr 16 17:13:32.917636 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:32.917625 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="8bd093de-7558-445a-af48-f9c7a9d6b76c" containerName="main" Apr 16 17:13:32.921354 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:32.921332 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" Apr 16 17:13:32.924132 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:32.924108 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 16 17:13:32.931507 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:32.931125 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld"] Apr 16 17:13:33.004725 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:33.004700 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/75cdbd21-24ab-4542-9601-3840e16e313d-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld\" (UID: \"75cdbd21-24ab-4542-9601-3840e16e313d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" Apr 16 17:13:33.004884 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:33.004733 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/75cdbd21-24ab-4542-9601-3840e16e313d-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld\" (UID: \"75cdbd21-24ab-4542-9601-3840e16e313d\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" Apr 16 17:13:33.004884 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:33.004758 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlv8t\" (UniqueName: \"kubernetes.io/projected/75cdbd21-24ab-4542-9601-3840e16e313d-kube-api-access-tlv8t\") pod \"router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld\" (UID: \"75cdbd21-24ab-4542-9601-3840e16e313d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" Apr 16 17:13:33.004884 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:33.004853 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/75cdbd21-24ab-4542-9601-3840e16e313d-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld\" (UID: \"75cdbd21-24ab-4542-9601-3840e16e313d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" Apr 16 17:13:33.005006 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:33.004919 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/75cdbd21-24ab-4542-9601-3840e16e313d-home\") pod \"router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld\" (UID: \"75cdbd21-24ab-4542-9601-3840e16e313d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" Apr 16 17:13:33.005006 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:33.004944 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/75cdbd21-24ab-4542-9601-3840e16e313d-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld\" (UID: \"75cdbd21-24ab-4542-9601-3840e16e313d\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" Apr 16 17:13:33.106028 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:33.105995 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/75cdbd21-24ab-4542-9601-3840e16e313d-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld\" (UID: \"75cdbd21-24ab-4542-9601-3840e16e313d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" Apr 16 17:13:33.106189 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:33.106042 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/75cdbd21-24ab-4542-9601-3840e16e313d-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld\" (UID: \"75cdbd21-24ab-4542-9601-3840e16e313d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" Apr 16 17:13:33.106240 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:33.106218 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/75cdbd21-24ab-4542-9601-3840e16e313d-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld\" (UID: \"75cdbd21-24ab-4542-9601-3840e16e313d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" Apr 16 17:13:33.106289 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:33.106259 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlv8t\" (UniqueName: \"kubernetes.io/projected/75cdbd21-24ab-4542-9601-3840e16e313d-kube-api-access-tlv8t\") pod \"router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld\" (UID: \"75cdbd21-24ab-4542-9601-3840e16e313d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" Apr 16 
17:13:33.106350 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:33.106304 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/75cdbd21-24ab-4542-9601-3840e16e313d-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld\" (UID: \"75cdbd21-24ab-4542-9601-3840e16e313d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" Apr 16 17:13:33.106418 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:33.106400 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/75cdbd21-24ab-4542-9601-3840e16e313d-home\") pod \"router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld\" (UID: \"75cdbd21-24ab-4542-9601-3840e16e313d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" Apr 16 17:13:33.106602 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:33.106407 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/75cdbd21-24ab-4542-9601-3840e16e313d-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld\" (UID: \"75cdbd21-24ab-4542-9601-3840e16e313d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" Apr 16 17:13:33.106602 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:33.106591 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/75cdbd21-24ab-4542-9601-3840e16e313d-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld\" (UID: \"75cdbd21-24ab-4542-9601-3840e16e313d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" Apr 16 17:13:33.106781 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:33.106691 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"home\" (UniqueName: \"kubernetes.io/empty-dir/75cdbd21-24ab-4542-9601-3840e16e313d-home\") pod \"router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld\" (UID: \"75cdbd21-24ab-4542-9601-3840e16e313d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" Apr 16 17:13:33.108506 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:33.108483 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/75cdbd21-24ab-4542-9601-3840e16e313d-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld\" (UID: \"75cdbd21-24ab-4542-9601-3840e16e313d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" Apr 16 17:13:33.108737 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:33.108720 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/75cdbd21-24ab-4542-9601-3840e16e313d-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld\" (UID: \"75cdbd21-24ab-4542-9601-3840e16e313d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" Apr 16 17:13:33.114291 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:33.114271 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlv8t\" (UniqueName: \"kubernetes.io/projected/75cdbd21-24ab-4542-9601-3840e16e313d-kube-api-access-tlv8t\") pod \"router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld\" (UID: \"75cdbd21-24ab-4542-9601-3840e16e313d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" Apr 16 17:13:33.235410 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:33.235342 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" Apr 16 17:13:33.361977 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:33.361952 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld"] Apr 16 17:13:33.363926 ip-10-0-137-126 kubenswrapper[2572]: W0416 17:13:33.363898 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75cdbd21_24ab_4542_9601_3840e16e313d.slice/crio-1ae17f584d0cdae1a7a1152908bc373ac7400f2fbb3d5853009e5ac1b33f5335 WatchSource:0}: Error finding container 1ae17f584d0cdae1a7a1152908bc373ac7400f2fbb3d5853009e5ac1b33f5335: Status 404 returned error can't find the container with id 1ae17f584d0cdae1a7a1152908bc373ac7400f2fbb3d5853009e5ac1b33f5335 Apr 16 17:13:33.965095 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:33.965041 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" event={"ID":"75cdbd21-24ab-4542-9601-3840e16e313d","Type":"ContainerStarted","Data":"6a0f721a682e3f3a1bea400b4ee5ae0a5cc3e7aa380f509a28525be18d5a8998"} Apr 16 17:13:33.965095 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:33.965092 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" event={"ID":"75cdbd21-24ab-4542-9601-3840e16e313d","Type":"ContainerStarted","Data":"1ae17f584d0cdae1a7a1152908bc373ac7400f2fbb3d5853009e5ac1b33f5335"} Apr 16 17:13:37.980484 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:37.980443 2572 generic.go:358] "Generic (PLEG): container finished" podID="75cdbd21-24ab-4542-9601-3840e16e313d" containerID="6a0f721a682e3f3a1bea400b4ee5ae0a5cc3e7aa380f509a28525be18d5a8998" exitCode=0 Apr 16 17:13:37.980835 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:37.980515 2572 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" event={"ID":"75cdbd21-24ab-4542-9601-3840e16e313d","Type":"ContainerDied","Data":"6a0f721a682e3f3a1bea400b4ee5ae0a5cc3e7aa380f509a28525be18d5a8998"} Apr 16 17:13:38.986097 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:38.986041 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" event={"ID":"75cdbd21-24ab-4542-9601-3840e16e313d","Type":"ContainerStarted","Data":"9ad34cdf99b91747c501ab1db0908eb84581cfb0963fdd02796b04794da8d8a6"} Apr 16 17:13:39.007354 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:39.007294 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" podStartSLOduration=7.007275707 podStartE2EDuration="7.007275707s" podCreationTimestamp="2026-04-16 17:13:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:13:39.005562804 +0000 UTC m=+1536.174274167" watchObservedRunningTime="2026-04-16 17:13:39.007275707 +0000 UTC m=+1536.175987071" Apr 16 17:13:43.236472 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:43.236432 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" Apr 16 17:13:43.236869 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:43.236486 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" Apr 16 17:13:43.237947 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:43.237923 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" 
podUID="75cdbd21-24ab-4542-9601-3840e16e313d" containerName="main" probeResult="failure" output="Get \"https://10.133.0.65:8000/health\": dial tcp 10.133.0.65:8000: connect: connection refused" Apr 16 17:13:44.022795 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:44.022760 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn"] Apr 16 17:13:44.028710 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:44.028688 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" Apr 16 17:13:44.032855 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:44.032830 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 16 17:13:44.033331 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:44.033312 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-epp-sa-dockercfg-bv8xk\"" Apr 16 17:13:44.035778 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:44.035747 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn"] Apr 16 17:13:44.206560 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:44.206525 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/86822240-f0d7-4b1c-9c15-c87383363f81-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn\" (UID: \"86822240-f0d7-4b1c-9c15-c87383363f81\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" Apr 16 17:13:44.206740 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:44.206572 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86822240-f0d7-4b1c-9c15-c87383363f81-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn\" (UID: \"86822240-f0d7-4b1c-9c15-c87383363f81\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" Apr 16 17:13:44.206740 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:44.206616 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szlgr\" (UniqueName: \"kubernetes.io/projected/86822240-f0d7-4b1c-9c15-c87383363f81-kube-api-access-szlgr\") pod \"scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn\" (UID: \"86822240-f0d7-4b1c-9c15-c87383363f81\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" Apr 16 17:13:44.206740 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:44.206668 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/86822240-f0d7-4b1c-9c15-c87383363f81-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn\" (UID: \"86822240-f0d7-4b1c-9c15-c87383363f81\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" Apr 16 17:13:44.206740 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:44.206698 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/86822240-f0d7-4b1c-9c15-c87383363f81-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn\" (UID: \"86822240-f0d7-4b1c-9c15-c87383363f81\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" Apr 16 17:13:44.206740 ip-10-0-137-126 kubenswrapper[2572]: I0416 
17:13:44.206724 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/86822240-f0d7-4b1c-9c15-c87383363f81-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn\" (UID: \"86822240-f0d7-4b1c-9c15-c87383363f81\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" Apr 16 17:13:44.307539 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:44.307461 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/86822240-f0d7-4b1c-9c15-c87383363f81-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn\" (UID: \"86822240-f0d7-4b1c-9c15-c87383363f81\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" Apr 16 17:13:44.307539 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:44.307499 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/86822240-f0d7-4b1c-9c15-c87383363f81-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn\" (UID: \"86822240-f0d7-4b1c-9c15-c87383363f81\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" Apr 16 17:13:44.307539 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:44.307520 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/86822240-f0d7-4b1c-9c15-c87383363f81-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn\" (UID: \"86822240-f0d7-4b1c-9c15-c87383363f81\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" Apr 16 17:13:44.308049 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:44.307584 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/86822240-f0d7-4b1c-9c15-c87383363f81-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn\" (UID: \"86822240-f0d7-4b1c-9c15-c87383363f81\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" Apr 16 17:13:44.308049 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:44.307619 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86822240-f0d7-4b1c-9c15-c87383363f81-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn\" (UID: \"86822240-f0d7-4b1c-9c15-c87383363f81\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" Apr 16 17:13:44.308049 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:44.307646 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-szlgr\" (UniqueName: \"kubernetes.io/projected/86822240-f0d7-4b1c-9c15-c87383363f81-kube-api-access-szlgr\") pod \"scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn\" (UID: \"86822240-f0d7-4b1c-9c15-c87383363f81\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" Apr 16 17:13:44.308049 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:44.307945 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/86822240-f0d7-4b1c-9c15-c87383363f81-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn\" (UID: \"86822240-f0d7-4b1c-9c15-c87383363f81\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" Apr 16 17:13:44.308049 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:44.308026 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/86822240-f0d7-4b1c-9c15-c87383363f81-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn\" (UID: \"86822240-f0d7-4b1c-9c15-c87383363f81\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" Apr 16 17:13:44.308270 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:44.308032 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/86822240-f0d7-4b1c-9c15-c87383363f81-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn\" (UID: \"86822240-f0d7-4b1c-9c15-c87383363f81\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" Apr 16 17:13:44.308270 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:44.308109 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86822240-f0d7-4b1c-9c15-c87383363f81-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn\" (UID: \"86822240-f0d7-4b1c-9c15-c87383363f81\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" Apr 16 17:13:44.310183 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:44.310150 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/86822240-f0d7-4b1c-9c15-c87383363f81-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn\" (UID: \"86822240-f0d7-4b1c-9c15-c87383363f81\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" Apr 16 17:13:44.316122 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:44.316054 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-szlgr\" 
(UniqueName: \"kubernetes.io/projected/86822240-f0d7-4b1c-9c15-c87383363f81-kube-api-access-szlgr\") pod \"scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn\" (UID: \"86822240-f0d7-4b1c-9c15-c87383363f81\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" Apr 16 17:13:44.340924 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:44.340897 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" Apr 16 17:13:44.677526 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:44.677494 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn"] Apr 16 17:13:44.678649 ip-10-0-137-126 kubenswrapper[2572]: W0416 17:13:44.678619 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86822240_f0d7_4b1c_9c15_c87383363f81.slice/crio-3811ecc0712fb0404d66016618c55f49b874f5d9521249ef3f1d2c156c3a4a72 WatchSource:0}: Error finding container 3811ecc0712fb0404d66016618c55f49b874f5d9521249ef3f1d2c156c3a4a72: Status 404 returned error can't find the container with id 3811ecc0712fb0404d66016618c55f49b874f5d9521249ef3f1d2c156c3a4a72 Apr 16 17:13:45.015403 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:45.015318 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" event={"ID":"86822240-f0d7-4b1c-9c15-c87383363f81","Type":"ContainerStarted","Data":"d678c07d91571f790df3b6db30389d63e8cf72c097fa0c36eab9b7967ffe82cc"} Apr 16 17:13:45.015403 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:45.015358 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" 
event={"ID":"86822240-f0d7-4b1c-9c15-c87383363f81","Type":"ContainerStarted","Data":"3811ecc0712fb0404d66016618c55f49b874f5d9521249ef3f1d2c156c3a4a72"} Apr 16 17:13:46.014774 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:46.014736 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 17:13:46.015210 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:46.015113 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="fbcd0ec2-e2ae-417c-aba4-086eb9fba102" containerName="main" containerID="cri-o://866d18da5952d8da6cec082bd75e88c2f598bcedc24f7c323eeee634d86f4454" gracePeriod=30 Apr 16 17:13:46.021779 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:46.021623 2572 generic.go:358] "Generic (PLEG): container finished" podID="86822240-f0d7-4b1c-9c15-c87383363f81" containerID="d678c07d91571f790df3b6db30389d63e8cf72c097fa0c36eab9b7967ffe82cc" exitCode=0 Apr 16 17:13:46.021779 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:46.021700 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" event={"ID":"86822240-f0d7-4b1c-9c15-c87383363f81","Type":"ContainerDied","Data":"d678c07d91571f790df3b6db30389d63e8cf72c097fa0c36eab9b7967ffe82cc"} Apr 16 17:13:46.915880 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:46.915849 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 17:13:47.028188 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.028095 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" event={"ID":"86822240-f0d7-4b1c-9c15-c87383363f81","Type":"ContainerStarted","Data":"f7965fd2754f3fed9f492b922d76672b2c5d466e45ed42f94383c2d95dbe2681"} Apr 16 17:13:47.028188 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.028138 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" event={"ID":"86822240-f0d7-4b1c-9c15-c87383363f81","Type":"ContainerStarted","Data":"1f22dfd392acb724634634a8cad1548ebf9b783a17860209ff0f543f585fcc14"} Apr 16 17:13:47.028662 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.028200 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" Apr 16 17:13:47.029832 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.029805 2572 generic.go:358] "Generic (PLEG): container finished" podID="fbcd0ec2-e2ae-417c-aba4-086eb9fba102" containerID="866d18da5952d8da6cec082bd75e88c2f598bcedc24f7c323eeee634d86f4454" exitCode=0 Apr 16 17:13:47.029937 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.029847 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"fbcd0ec2-e2ae-417c-aba4-086eb9fba102","Type":"ContainerDied","Data":"866d18da5952d8da6cec082bd75e88c2f598bcedc24f7c323eeee634d86f4454"} Apr 16 17:13:47.029937 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.029858 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 17:13:47.029937 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.029872 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"fbcd0ec2-e2ae-417c-aba4-086eb9fba102","Type":"ContainerDied","Data":"a252f53e77e144a8cf46a0245b964f94c2e4d40785424253cc3b29c632a5fa06"} Apr 16 17:13:47.029937 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.029893 2572 scope.go:117] "RemoveContainer" containerID="866d18da5952d8da6cec082bd75e88c2f598bcedc24f7c323eeee634d86f4454" Apr 16 17:13:47.035742 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.034971 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-dshm\") pod \"fbcd0ec2-e2ae-417c-aba4-086eb9fba102\" (UID: \"fbcd0ec2-e2ae-417c-aba4-086eb9fba102\") " Apr 16 17:13:47.035742 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.035043 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8gqr\" (UniqueName: \"kubernetes.io/projected/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-kube-api-access-j8gqr\") pod \"fbcd0ec2-e2ae-417c-aba4-086eb9fba102\" (UID: \"fbcd0ec2-e2ae-417c-aba4-086eb9fba102\") " Apr 16 17:13:47.035742 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.035100 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-kserve-provision-location\") pod \"fbcd0ec2-e2ae-417c-aba4-086eb9fba102\" (UID: \"fbcd0ec2-e2ae-417c-aba4-086eb9fba102\") " Apr 16 17:13:47.035742 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.035129 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-home\") pod \"fbcd0ec2-e2ae-417c-aba4-086eb9fba102\" (UID: \"fbcd0ec2-e2ae-417c-aba4-086eb9fba102\") " Apr 16 17:13:47.035742 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.035172 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-model-cache\") pod \"fbcd0ec2-e2ae-417c-aba4-086eb9fba102\" (UID: \"fbcd0ec2-e2ae-417c-aba4-086eb9fba102\") " Apr 16 17:13:47.035742 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.035269 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-tls-certs\") pod \"fbcd0ec2-e2ae-417c-aba4-086eb9fba102\" (UID: \"fbcd0ec2-e2ae-417c-aba4-086eb9fba102\") " Apr 16 17:13:47.035742 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.035696 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-home" (OuterVolumeSpecName: "home") pod "fbcd0ec2-e2ae-417c-aba4-086eb9fba102" (UID: "fbcd0ec2-e2ae-417c-aba4-086eb9fba102"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:13:47.036316 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.036226 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-model-cache" (OuterVolumeSpecName: "model-cache") pod "fbcd0ec2-e2ae-417c-aba4-086eb9fba102" (UID: "fbcd0ec2-e2ae-417c-aba4-086eb9fba102"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:13:47.038624 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.038604 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-dshm" (OuterVolumeSpecName: "dshm") pod "fbcd0ec2-e2ae-417c-aba4-086eb9fba102" (UID: "fbcd0ec2-e2ae-417c-aba4-086eb9fba102"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:13:47.038721 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.038637 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-kube-api-access-j8gqr" (OuterVolumeSpecName: "kube-api-access-j8gqr") pod "fbcd0ec2-e2ae-417c-aba4-086eb9fba102" (UID: "fbcd0ec2-e2ae-417c-aba4-086eb9fba102"). InnerVolumeSpecName "kube-api-access-j8gqr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:13:47.039054 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.039032 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "fbcd0ec2-e2ae-417c-aba4-086eb9fba102" (UID: "fbcd0ec2-e2ae-417c-aba4-086eb9fba102"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:13:47.060636 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.060614 2572 scope.go:117] "RemoveContainer" containerID="0e006b28974ad8ace9b8151a95003930bdfee5f48519a4e0c8db1362fe5c4458" Apr 16 17:13:47.063358 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.063044 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" podStartSLOduration=3.06302806 podStartE2EDuration="3.06302806s" podCreationTimestamp="2026-04-16 17:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:13:47.061777524 +0000 UTC m=+1544.230488887" watchObservedRunningTime="2026-04-16 17:13:47.06302806 +0000 UTC m=+1544.231739425" Apr 16 17:13:47.070107 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.070058 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fbcd0ec2-e2ae-417c-aba4-086eb9fba102" (UID: "fbcd0ec2-e2ae-417c-aba4-086eb9fba102"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:13:47.109814 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.109788 2572 scope.go:117] "RemoveContainer" containerID="866d18da5952d8da6cec082bd75e88c2f598bcedc24f7c323eeee634d86f4454" Apr 16 17:13:47.110163 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:13:47.110143 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"866d18da5952d8da6cec082bd75e88c2f598bcedc24f7c323eeee634d86f4454\": container with ID starting with 866d18da5952d8da6cec082bd75e88c2f598bcedc24f7c323eeee634d86f4454 not found: ID does not exist" containerID="866d18da5952d8da6cec082bd75e88c2f598bcedc24f7c323eeee634d86f4454" Apr 16 17:13:47.110271 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.110169 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"866d18da5952d8da6cec082bd75e88c2f598bcedc24f7c323eeee634d86f4454"} err="failed to get container status \"866d18da5952d8da6cec082bd75e88c2f598bcedc24f7c323eeee634d86f4454\": rpc error: code = NotFound desc = could not find container \"866d18da5952d8da6cec082bd75e88c2f598bcedc24f7c323eeee634d86f4454\": container with ID starting with 866d18da5952d8da6cec082bd75e88c2f598bcedc24f7c323eeee634d86f4454 not found: ID does not exist" Apr 16 17:13:47.110271 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.110189 2572 scope.go:117] "RemoveContainer" containerID="0e006b28974ad8ace9b8151a95003930bdfee5f48519a4e0c8db1362fe5c4458" Apr 16 17:13:47.110484 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:13:47.110456 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e006b28974ad8ace9b8151a95003930bdfee5f48519a4e0c8db1362fe5c4458\": container with ID starting with 0e006b28974ad8ace9b8151a95003930bdfee5f48519a4e0c8db1362fe5c4458 not found: ID does not exist" 
containerID="0e006b28974ad8ace9b8151a95003930bdfee5f48519a4e0c8db1362fe5c4458" Apr 16 17:13:47.110550 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.110492 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e006b28974ad8ace9b8151a95003930bdfee5f48519a4e0c8db1362fe5c4458"} err="failed to get container status \"0e006b28974ad8ace9b8151a95003930bdfee5f48519a4e0c8db1362fe5c4458\": rpc error: code = NotFound desc = could not find container \"0e006b28974ad8ace9b8151a95003930bdfee5f48519a4e0c8db1362fe5c4458\": container with ID starting with 0e006b28974ad8ace9b8151a95003930bdfee5f48519a4e0c8db1362fe5c4458 not found: ID does not exist" Apr 16 17:13:47.136733 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.136700 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-dshm\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:13:47.136880 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.136736 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j8gqr\" (UniqueName: \"kubernetes.io/projected/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-kube-api-access-j8gqr\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:13:47.136880 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.136755 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-kserve-provision-location\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:13:47.136880 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.136768 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-home\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:13:47.136880 ip-10-0-137-126 kubenswrapper[2572]: 
I0416 17:13:47.136783 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-model-cache\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:13:47.136880 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.136796 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fbcd0ec2-e2ae-417c-aba4-086eb9fba102-tls-certs\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:13:47.355886 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.355854 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 17:13:47.361263 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.361237 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 17:13:47.591183 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:47.591097 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbcd0ec2-e2ae-417c-aba4-086eb9fba102" path="/var/lib/kubelet/pods/fbcd0ec2-e2ae-417c-aba4-086eb9fba102/volumes" Apr 16 17:13:53.236080 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:53.236031 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" podUID="75cdbd21-24ab-4542-9601-3840e16e313d" containerName="main" probeResult="failure" output="Get \"https://10.133.0.65:8000/health\": dial tcp 10.133.0.65:8000: connect: connection refused" Apr 16 17:13:54.341879 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:54.341838 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" Apr 16 17:13:54.342406 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:54.342001 2572 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" Apr 16 17:13:54.344722 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:54.344698 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" Apr 16 17:13:54.889926 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:54.889855 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" podUID="f13b4cc2-1661-4f97-af96-93ea0d79f1af" containerName="llm-d-routing-sidecar" containerID="cri-o://c1b5f684a53bd0e789c3ee1c07c8781a56b7185d00114897e7a11d4e605957ab" gracePeriod=2 Apr 16 17:13:55.067496 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.067466 2572 generic.go:358] "Generic (PLEG): container finished" podID="f13b4cc2-1661-4f97-af96-93ea0d79f1af" containerID="c1b5f684a53bd0e789c3ee1c07c8781a56b7185d00114897e7a11d4e605957ab" exitCode=0 Apr 16 17:13:55.067598 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.067544 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" event={"ID":"f13b4cc2-1661-4f97-af96-93ea0d79f1af","Type":"ContainerDied","Data":"c1b5f684a53bd0e789c3ee1c07c8781a56b7185d00114897e7a11d4e605957ab"} Apr 16 17:13:55.069286 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.069269 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" Apr 16 17:13:55.291806 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.291784 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" Apr 16 17:13:55.303781 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.303756 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-home\") pod \"23f8a34b-3c7e-4539-a8b5-43fac58d50e8\" (UID: \"23f8a34b-3c7e-4539-a8b5-43fac58d50e8\") " Apr 16 17:13:55.303927 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.303791 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-kserve-provision-location\") pod \"23f8a34b-3c7e-4539-a8b5-43fac58d50e8\" (UID: \"23f8a34b-3c7e-4539-a8b5-43fac58d50e8\") " Apr 16 17:13:55.303927 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.303830 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-tls-certs\") pod \"23f8a34b-3c7e-4539-a8b5-43fac58d50e8\" (UID: \"23f8a34b-3c7e-4539-a8b5-43fac58d50e8\") " Apr 16 17:13:55.303927 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.303851 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnqhd\" (UniqueName: \"kubernetes.io/projected/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-kube-api-access-rnqhd\") pod \"23f8a34b-3c7e-4539-a8b5-43fac58d50e8\" (UID: \"23f8a34b-3c7e-4539-a8b5-43fac58d50e8\") " Apr 16 17:13:55.303927 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.303867 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-dshm\") pod \"23f8a34b-3c7e-4539-a8b5-43fac58d50e8\" (UID: \"23f8a34b-3c7e-4539-a8b5-43fac58d50e8\") " Apr 16 17:13:55.304165 ip-10-0-137-126 
kubenswrapper[2572]: I0416 17:13:55.304008 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-model-cache\") pod \"23f8a34b-3c7e-4539-a8b5-43fac58d50e8\" (UID: \"23f8a34b-3c7e-4539-a8b5-43fac58d50e8\") " Apr 16 17:13:55.304165 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.304139 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-home" (OuterVolumeSpecName: "home") pod "23f8a34b-3c7e-4539-a8b5-43fac58d50e8" (UID: "23f8a34b-3c7e-4539-a8b5-43fac58d50e8"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:13:55.304571 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.304426 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-home\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:13:55.304571 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.304537 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-model-cache" (OuterVolumeSpecName: "model-cache") pod "23f8a34b-3c7e-4539-a8b5-43fac58d50e8" (UID: "23f8a34b-3c7e-4539-a8b5-43fac58d50e8"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:13:55.306058 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.305990 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "23f8a34b-3c7e-4539-a8b5-43fac58d50e8" (UID: "23f8a34b-3c7e-4539-a8b5-43fac58d50e8"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:13:55.306702 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.306680 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-dshm" (OuterVolumeSpecName: "dshm") pod "23f8a34b-3c7e-4539-a8b5-43fac58d50e8" (UID: "23f8a34b-3c7e-4539-a8b5-43fac58d50e8"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:13:55.306823 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.306789 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-kube-api-access-rnqhd" (OuterVolumeSpecName: "kube-api-access-rnqhd") pod "23f8a34b-3c7e-4539-a8b5-43fac58d50e8" (UID: "23f8a34b-3c7e-4539-a8b5-43fac58d50e8"). InnerVolumeSpecName "kube-api-access-rnqhd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:13:55.318464 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.318442 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-855cffc847-fzvhl_f13b4cc2-1661-4f97-af96-93ea0d79f1af/main/0.log" Apr 16 17:13:55.319155 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.319137 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" Apr 16 17:13:55.323908 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.323884 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "23f8a34b-3c7e-4539-a8b5-43fac58d50e8" (UID: "23f8a34b-3c7e-4539-a8b5-43fac58d50e8"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:13:55.405601 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.405496 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f13b4cc2-1661-4f97-af96-93ea0d79f1af-dshm\") pod \"f13b4cc2-1661-4f97-af96-93ea0d79f1af\" (UID: \"f13b4cc2-1661-4f97-af96-93ea0d79f1af\") " Apr 16 17:13:55.405601 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.405534 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f13b4cc2-1661-4f97-af96-93ea0d79f1af-kserve-provision-location\") pod \"f13b4cc2-1661-4f97-af96-93ea0d79f1af\" (UID: \"f13b4cc2-1661-4f97-af96-93ea0d79f1af\") " Apr 16 17:13:55.405601 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.405572 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f13b4cc2-1661-4f97-af96-93ea0d79f1af-home\") pod \"f13b4cc2-1661-4f97-af96-93ea0d79f1af\" (UID: \"f13b4cc2-1661-4f97-af96-93ea0d79f1af\") " Apr 16 17:13:55.405601 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.405591 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f13b4cc2-1661-4f97-af96-93ea0d79f1af-tls-certs\") pod \"f13b4cc2-1661-4f97-af96-93ea0d79f1af\" (UID: \"f13b4cc2-1661-4f97-af96-93ea0d79f1af\") " Apr 16 17:13:55.405601 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.405607 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h7j8\" (UniqueName: \"kubernetes.io/projected/f13b4cc2-1661-4f97-af96-93ea0d79f1af-kube-api-access-7h7j8\") pod \"f13b4cc2-1661-4f97-af96-93ea0d79f1af\" (UID: \"f13b4cc2-1661-4f97-af96-93ea0d79f1af\") " Apr 16 17:13:55.406329 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.405655 2572 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f13b4cc2-1661-4f97-af96-93ea0d79f1af-model-cache\") pod \"f13b4cc2-1661-4f97-af96-93ea0d79f1af\" (UID: \"f13b4cc2-1661-4f97-af96-93ea0d79f1af\") " Apr 16 17:13:55.406329 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.405873 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-model-cache\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:13:55.406329 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.405892 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-kserve-provision-location\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:13:55.406329 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.405907 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-tls-certs\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:13:55.406329 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.405920 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rnqhd\" (UniqueName: \"kubernetes.io/projected/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-kube-api-access-rnqhd\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:13:55.406329 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.405934 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/23f8a34b-3c7e-4539-a8b5-43fac58d50e8-dshm\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:13:55.406329 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.406126 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f13b4cc2-1661-4f97-af96-93ea0d79f1af-model-cache" (OuterVolumeSpecName: "model-cache") pod "f13b4cc2-1661-4f97-af96-93ea0d79f1af" (UID: "f13b4cc2-1661-4f97-af96-93ea0d79f1af"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:13:55.406690 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.406363 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f13b4cc2-1661-4f97-af96-93ea0d79f1af-home" (OuterVolumeSpecName: "home") pod "f13b4cc2-1661-4f97-af96-93ea0d79f1af" (UID: "f13b4cc2-1661-4f97-af96-93ea0d79f1af"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:13:55.407863 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.407832 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f13b4cc2-1661-4f97-af96-93ea0d79f1af-kube-api-access-7h7j8" (OuterVolumeSpecName: "kube-api-access-7h7j8") pod "f13b4cc2-1661-4f97-af96-93ea0d79f1af" (UID: "f13b4cc2-1661-4f97-af96-93ea0d79f1af"). InnerVolumeSpecName "kube-api-access-7h7j8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:13:55.408001 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.407980 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f13b4cc2-1661-4f97-af96-93ea0d79f1af-dshm" (OuterVolumeSpecName: "dshm") pod "f13b4cc2-1661-4f97-af96-93ea0d79f1af" (UID: "f13b4cc2-1661-4f97-af96-93ea0d79f1af"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:13:55.408082 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.407981 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f13b4cc2-1661-4f97-af96-93ea0d79f1af-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f13b4cc2-1661-4f97-af96-93ea0d79f1af" (UID: "f13b4cc2-1661-4f97-af96-93ea0d79f1af"). 
InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:13:55.466394 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.466343 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f13b4cc2-1661-4f97-af96-93ea0d79f1af-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f13b4cc2-1661-4f97-af96-93ea0d79f1af" (UID: "f13b4cc2-1661-4f97-af96-93ea0d79f1af"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:13:55.506610 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.506582 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f13b4cc2-1661-4f97-af96-93ea0d79f1af-home\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:13:55.506610 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.506608 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f13b4cc2-1661-4f97-af96-93ea0d79f1af-tls-certs\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:13:55.506753 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.506621 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7h7j8\" (UniqueName: \"kubernetes.io/projected/f13b4cc2-1661-4f97-af96-93ea0d79f1af-kube-api-access-7h7j8\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:13:55.506753 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.506632 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f13b4cc2-1661-4f97-af96-93ea0d79f1af-model-cache\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:13:55.506753 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.506641 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/f13b4cc2-1661-4f97-af96-93ea0d79f1af-dshm\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:13:55.506753 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:55.506649 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f13b4cc2-1661-4f97-af96-93ea0d79f1af-kserve-provision-location\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:13:56.073357 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:56.073317 2572 generic.go:358] "Generic (PLEG): container finished" podID="23f8a34b-3c7e-4539-a8b5-43fac58d50e8" containerID="aa2136223c4ccddf7be5dee8f240fae96336530a2f300f1ca4e1a92fc3eac442" exitCode=137 Apr 16 17:13:56.073523 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:56.073380 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" Apr 16 17:13:56.073523 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:56.073393 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" event={"ID":"23f8a34b-3c7e-4539-a8b5-43fac58d50e8","Type":"ContainerDied","Data":"aa2136223c4ccddf7be5dee8f240fae96336530a2f300f1ca4e1a92fc3eac442"} Apr 16 17:13:56.073523 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:56.073426 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz" event={"ID":"23f8a34b-3c7e-4539-a8b5-43fac58d50e8","Type":"ContainerDied","Data":"4eb8a13e45b2ce6112ae03a10bacc345c9c969fc7bd135478728f1873f3fae8f"} Apr 16 17:13:56.073523 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:56.073442 2572 scope.go:117] "RemoveContainer" containerID="aa2136223c4ccddf7be5dee8f240fae96336530a2f300f1ca4e1a92fc3eac442" Apr 16 17:13:56.075078 ip-10-0-137-126 kubenswrapper[2572]: I0416 
17:13:56.075037 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-855cffc847-fzvhl_f13b4cc2-1661-4f97-af96-93ea0d79f1af/main/0.log"
Apr 16 17:13:56.075816 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:56.075791 2572 generic.go:358] "Generic (PLEG): container finished" podID="f13b4cc2-1661-4f97-af96-93ea0d79f1af" containerID="a3109ff6efa1d49a58e1f593716eef087ddde08ade83dc76ffbd7630f76947a1" exitCode=137
Apr 16 17:13:56.075928 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:56.075871 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" event={"ID":"f13b4cc2-1661-4f97-af96-93ea0d79f1af","Type":"ContainerDied","Data":"a3109ff6efa1d49a58e1f593716eef087ddde08ade83dc76ffbd7630f76947a1"}
Apr 16 17:13:56.075928 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:56.075906 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl" event={"ID":"f13b4cc2-1661-4f97-af96-93ea0d79f1af","Type":"ContainerDied","Data":"f6067c6f9bd3470932f90e0ab22c888384cc1213fd44854b8ac878e07b7ed48a"}
Apr 16 17:13:56.076048 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:56.075932 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl"
Apr 16 17:13:56.093344 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:56.093323 2572 scope.go:117] "RemoveContainer" containerID="bd025d9680dbc07a4f89942232558bdcad056a6317b69293b6a06c2fdabd05a0"
Apr 16 17:13:56.094239 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:56.094204 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl"]
Apr 16 17:13:56.098053 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:56.098031 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-855cffc847-fzvhl"]
Apr 16 17:13:56.108614 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:56.108595 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz"]
Apr 16 17:13:56.115089 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:56.115047 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-57f6888d64-vq5hz"]
Apr 16 17:13:56.119323 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:56.119307 2572 scope.go:117] "RemoveContainer" containerID="aa2136223c4ccddf7be5dee8f240fae96336530a2f300f1ca4e1a92fc3eac442"
Apr 16 17:13:56.119639 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:13:56.119593 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa2136223c4ccddf7be5dee8f240fae96336530a2f300f1ca4e1a92fc3eac442\": container with ID starting with aa2136223c4ccddf7be5dee8f240fae96336530a2f300f1ca4e1a92fc3eac442 not found: ID does not exist" containerID="aa2136223c4ccddf7be5dee8f240fae96336530a2f300f1ca4e1a92fc3eac442"
Apr 16 17:13:56.119740 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:56.119638 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa2136223c4ccddf7be5dee8f240fae96336530a2f300f1ca4e1a92fc3eac442"} err="failed to get container status \"aa2136223c4ccddf7be5dee8f240fae96336530a2f300f1ca4e1a92fc3eac442\": rpc error: code = NotFound desc = could not find container \"aa2136223c4ccddf7be5dee8f240fae96336530a2f300f1ca4e1a92fc3eac442\": container with ID starting with aa2136223c4ccddf7be5dee8f240fae96336530a2f300f1ca4e1a92fc3eac442 not found: ID does not exist"
Apr 16 17:13:56.119740 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:56.119658 2572 scope.go:117] "RemoveContainer" containerID="bd025d9680dbc07a4f89942232558bdcad056a6317b69293b6a06c2fdabd05a0"
Apr 16 17:13:56.119876 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:13:56.119859 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd025d9680dbc07a4f89942232558bdcad056a6317b69293b6a06c2fdabd05a0\": container with ID starting with bd025d9680dbc07a4f89942232558bdcad056a6317b69293b6a06c2fdabd05a0 not found: ID does not exist" containerID="bd025d9680dbc07a4f89942232558bdcad056a6317b69293b6a06c2fdabd05a0"
Apr 16 17:13:56.119935 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:56.119882 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd025d9680dbc07a4f89942232558bdcad056a6317b69293b6a06c2fdabd05a0"} err="failed to get container status \"bd025d9680dbc07a4f89942232558bdcad056a6317b69293b6a06c2fdabd05a0\": rpc error: code = NotFound desc = could not find container \"bd025d9680dbc07a4f89942232558bdcad056a6317b69293b6a06c2fdabd05a0\": container with ID starting with bd025d9680dbc07a4f89942232558bdcad056a6317b69293b6a06c2fdabd05a0 not found: ID does not exist"
Apr 16 17:13:56.119935 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:56.119899 2572 scope.go:117] "RemoveContainer" containerID="a3109ff6efa1d49a58e1f593716eef087ddde08ade83dc76ffbd7630f76947a1"
Apr 16 17:13:56.139453 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:56.139435 2572 scope.go:117] "RemoveContainer" containerID="ac51caa5272ac57d431b728f12e3ee7f52f87f163884e057d52315fba17c0dff"
Apr 16 17:13:56.206502 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:56.206481 2572 scope.go:117] "RemoveContainer" containerID="c1b5f684a53bd0e789c3ee1c07c8781a56b7185d00114897e7a11d4e605957ab"
Apr 16 17:13:56.213923 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:56.213904 2572 scope.go:117] "RemoveContainer" containerID="a3109ff6efa1d49a58e1f593716eef087ddde08ade83dc76ffbd7630f76947a1"
Apr 16 17:13:56.214195 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:13:56.214173 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3109ff6efa1d49a58e1f593716eef087ddde08ade83dc76ffbd7630f76947a1\": container with ID starting with a3109ff6efa1d49a58e1f593716eef087ddde08ade83dc76ffbd7630f76947a1 not found: ID does not exist" containerID="a3109ff6efa1d49a58e1f593716eef087ddde08ade83dc76ffbd7630f76947a1"
Apr 16 17:13:56.214280 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:56.214206 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3109ff6efa1d49a58e1f593716eef087ddde08ade83dc76ffbd7630f76947a1"} err="failed to get container status \"a3109ff6efa1d49a58e1f593716eef087ddde08ade83dc76ffbd7630f76947a1\": rpc error: code = NotFound desc = could not find container \"a3109ff6efa1d49a58e1f593716eef087ddde08ade83dc76ffbd7630f76947a1\": container with ID starting with a3109ff6efa1d49a58e1f593716eef087ddde08ade83dc76ffbd7630f76947a1 not found: ID does not exist"
Apr 16 17:13:56.214280 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:56.214234 2572 scope.go:117] "RemoveContainer" containerID="ac51caa5272ac57d431b728f12e3ee7f52f87f163884e057d52315fba17c0dff"
Apr 16 17:13:56.214505 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:13:56.214485 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac51caa5272ac57d431b728f12e3ee7f52f87f163884e057d52315fba17c0dff\": container with ID starting with ac51caa5272ac57d431b728f12e3ee7f52f87f163884e057d52315fba17c0dff not found: ID does not exist" containerID="ac51caa5272ac57d431b728f12e3ee7f52f87f163884e057d52315fba17c0dff"
Apr 16 17:13:56.214552 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:56.214511 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac51caa5272ac57d431b728f12e3ee7f52f87f163884e057d52315fba17c0dff"} err="failed to get container status \"ac51caa5272ac57d431b728f12e3ee7f52f87f163884e057d52315fba17c0dff\": rpc error: code = NotFound desc = could not find container \"ac51caa5272ac57d431b728f12e3ee7f52f87f163884e057d52315fba17c0dff\": container with ID starting with ac51caa5272ac57d431b728f12e3ee7f52f87f163884e057d52315fba17c0dff not found: ID does not exist"
Apr 16 17:13:56.214552 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:56.214528 2572 scope.go:117] "RemoveContainer" containerID="c1b5f684a53bd0e789c3ee1c07c8781a56b7185d00114897e7a11d4e605957ab"
Apr 16 17:13:56.214778 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:13:56.214760 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1b5f684a53bd0e789c3ee1c07c8781a56b7185d00114897e7a11d4e605957ab\": container with ID starting with c1b5f684a53bd0e789c3ee1c07c8781a56b7185d00114897e7a11d4e605957ab not found: ID does not exist" containerID="c1b5f684a53bd0e789c3ee1c07c8781a56b7185d00114897e7a11d4e605957ab"
Apr 16 17:13:56.214832 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:56.214787 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1b5f684a53bd0e789c3ee1c07c8781a56b7185d00114897e7a11d4e605957ab"} err="failed to get container status \"c1b5f684a53bd0e789c3ee1c07c8781a56b7185d00114897e7a11d4e605957ab\": rpc error: code = NotFound desc = could not find container \"c1b5f684a53bd0e789c3ee1c07c8781a56b7185d00114897e7a11d4e605957ab\": container with ID starting with c1b5f684a53bd0e789c3ee1c07c8781a56b7185d00114897e7a11d4e605957ab not found: ID does not exist"
Apr 16 17:13:57.591299 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:57.591268 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23f8a34b-3c7e-4539-a8b5-43fac58d50e8" path="/var/lib/kubelet/pods/23f8a34b-3c7e-4539-a8b5-43fac58d50e8/volumes"
Apr 16 17:13:57.591743 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:13:57.591680 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f13b4cc2-1661-4f97-af96-93ea0d79f1af" path="/var/lib/kubelet/pods/f13b4cc2-1661-4f97-af96-93ea0d79f1af/volumes"
Apr 16 17:14:03.236779 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:03.236735 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" podUID="75cdbd21-24ab-4542-9601-3840e16e313d" containerName="main" probeResult="failure" output="Get \"https://10.133.0.65:8000/health\": dial tcp 10.133.0.65:8000: connect: connection refused"
Apr 16 17:14:13.235769 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:13.235723 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" podUID="75cdbd21-24ab-4542-9601-3840e16e313d" containerName="main" probeResult="failure" output="Get \"https://10.133.0.65:8000/health\": dial tcp 10.133.0.65:8000: connect: connection refused"
Apr 16 17:14:17.083838 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:17.083809 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn"
Apr 16 17:14:18.149123 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:18.149086 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn"]
Apr 16 17:14:18.149527 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:18.149474 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" podUID="86822240-f0d7-4b1c-9c15-c87383363f81" containerName="main" containerID="cri-o://1f22dfd392acb724634634a8cad1548ebf9b783a17860209ff0f543f585fcc14" gracePeriod=30
Apr 16 17:14:18.149606 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:18.149525 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" podUID="86822240-f0d7-4b1c-9c15-c87383363f81" containerName="tokenizer" containerID="cri-o://f7965fd2754f3fed9f492b922d76672b2c5d466e45ed42f94383c2d95dbe2681" gracePeriod=30
Apr 16 17:14:19.180565 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:19.180535 2572 generic.go:358] "Generic (PLEG): container finished" podID="86822240-f0d7-4b1c-9c15-c87383363f81" containerID="1f22dfd392acb724634634a8cad1548ebf9b783a17860209ff0f543f585fcc14" exitCode=0
Apr 16 17:14:19.180870 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:19.180611 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" event={"ID":"86822240-f0d7-4b1c-9c15-c87383363f81","Type":"ContainerDied","Data":"1f22dfd392acb724634634a8cad1548ebf9b783a17860209ff0f543f585fcc14"}
Apr 16 17:14:19.389464 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:19.389443 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn"
Apr 16 17:14:19.516221 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:19.516155 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szlgr\" (UniqueName: \"kubernetes.io/projected/86822240-f0d7-4b1c-9c15-c87383363f81-kube-api-access-szlgr\") pod \"86822240-f0d7-4b1c-9c15-c87383363f81\" (UID: \"86822240-f0d7-4b1c-9c15-c87383363f81\") "
Apr 16 17:14:19.516221 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:19.516190 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/86822240-f0d7-4b1c-9c15-c87383363f81-tokenizer-uds\") pod \"86822240-f0d7-4b1c-9c15-c87383363f81\" (UID: \"86822240-f0d7-4b1c-9c15-c87383363f81\") "
Apr 16 17:14:19.516433 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:19.516235 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86822240-f0d7-4b1c-9c15-c87383363f81-kserve-provision-location\") pod \"86822240-f0d7-4b1c-9c15-c87383363f81\" (UID: \"86822240-f0d7-4b1c-9c15-c87383363f81\") "
Apr 16 17:14:19.516433 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:19.516263 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/86822240-f0d7-4b1c-9c15-c87383363f81-tokenizer-tmp\") pod \"86822240-f0d7-4b1c-9c15-c87383363f81\" (UID: \"86822240-f0d7-4b1c-9c15-c87383363f81\") "
Apr 16 17:14:19.516433 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:19.516290 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/86822240-f0d7-4b1c-9c15-c87383363f81-tokenizer-cache\") pod \"86822240-f0d7-4b1c-9c15-c87383363f81\" (UID: \"86822240-f0d7-4b1c-9c15-c87383363f81\") "
Apr 16 17:14:19.516433 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:19.516353 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/86822240-f0d7-4b1c-9c15-c87383363f81-tls-certs\") pod \"86822240-f0d7-4b1c-9c15-c87383363f81\" (UID: \"86822240-f0d7-4b1c-9c15-c87383363f81\") "
Apr 16 17:14:19.516686 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:19.516493 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86822240-f0d7-4b1c-9c15-c87383363f81-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "86822240-f0d7-4b1c-9c15-c87383363f81" (UID: "86822240-f0d7-4b1c-9c15-c87383363f81"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:14:19.516686 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:19.516551 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86822240-f0d7-4b1c-9c15-c87383363f81-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "86822240-f0d7-4b1c-9c15-c87383363f81" (UID: "86822240-f0d7-4b1c-9c15-c87383363f81"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:14:19.516686 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:19.516594 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/86822240-f0d7-4b1c-9c15-c87383363f81-tokenizer-uds\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 17:14:19.516686 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:19.516598 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86822240-f0d7-4b1c-9c15-c87383363f81-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "86822240-f0d7-4b1c-9c15-c87383363f81" (UID: "86822240-f0d7-4b1c-9c15-c87383363f81"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:14:19.516899 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:19.516881 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86822240-f0d7-4b1c-9c15-c87383363f81-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "86822240-f0d7-4b1c-9c15-c87383363f81" (UID: "86822240-f0d7-4b1c-9c15-c87383363f81"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:14:19.518433 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:19.518406 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86822240-f0d7-4b1c-9c15-c87383363f81-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "86822240-f0d7-4b1c-9c15-c87383363f81" (UID: "86822240-f0d7-4b1c-9c15-c87383363f81"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 17:14:19.518767 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:19.518748 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86822240-f0d7-4b1c-9c15-c87383363f81-kube-api-access-szlgr" (OuterVolumeSpecName: "kube-api-access-szlgr") pod "86822240-f0d7-4b1c-9c15-c87383363f81" (UID: "86822240-f0d7-4b1c-9c15-c87383363f81"). InnerVolumeSpecName "kube-api-access-szlgr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 17:14:19.617085 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:19.617052 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86822240-f0d7-4b1c-9c15-c87383363f81-kserve-provision-location\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 17:14:19.617181 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:19.617087 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/86822240-f0d7-4b1c-9c15-c87383363f81-tokenizer-tmp\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 17:14:19.617181 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:19.617098 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/86822240-f0d7-4b1c-9c15-c87383363f81-tokenizer-cache\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 17:14:19.617181 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:19.617107 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/86822240-f0d7-4b1c-9c15-c87383363f81-tls-certs\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 17:14:19.617181 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:19.617116 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-szlgr\" (UniqueName: \"kubernetes.io/projected/86822240-f0d7-4b1c-9c15-c87383363f81-kube-api-access-szlgr\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\""
Apr 16 17:14:20.186928 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:20.186896 2572 generic.go:358] "Generic (PLEG): container finished" podID="86822240-f0d7-4b1c-9c15-c87383363f81" containerID="f7965fd2754f3fed9f492b922d76672b2c5d466e45ed42f94383c2d95dbe2681" exitCode=0
Apr 16 17:14:20.187412 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:20.186984 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" event={"ID":"86822240-f0d7-4b1c-9c15-c87383363f81","Type":"ContainerDied","Data":"f7965fd2754f3fed9f492b922d76672b2c5d466e45ed42f94383c2d95dbe2681"}
Apr 16 17:14:20.187412 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:20.186994 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn"
Apr 16 17:14:20.187412 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:20.187034 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn" event={"ID":"86822240-f0d7-4b1c-9c15-c87383363f81","Type":"ContainerDied","Data":"3811ecc0712fb0404d66016618c55f49b874f5d9521249ef3f1d2c156c3a4a72"}
Apr 16 17:14:20.187412 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:20.187058 2572 scope.go:117] "RemoveContainer" containerID="f7965fd2754f3fed9f492b922d76672b2c5d466e45ed42f94383c2d95dbe2681"
Apr 16 17:14:20.197864 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:20.197849 2572 scope.go:117] "RemoveContainer" containerID="1f22dfd392acb724634634a8cad1548ebf9b783a17860209ff0f543f585fcc14"
Apr 16 17:14:20.205631 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:20.205610 2572 scope.go:117] "RemoveContainer" containerID="d678c07d91571f790df3b6db30389d63e8cf72c097fa0c36eab9b7967ffe82cc"
Apr 16 17:14:20.207297 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:20.207248 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn"]
Apr 16 17:14:20.209567 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:20.209546 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-99db9zx4rn"]
Apr 16 17:14:20.213971 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:20.213954 2572 scope.go:117] "RemoveContainer" containerID="f7965fd2754f3fed9f492b922d76672b2c5d466e45ed42f94383c2d95dbe2681"
Apr 16 17:14:20.214240 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:14:20.214224 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7965fd2754f3fed9f492b922d76672b2c5d466e45ed42f94383c2d95dbe2681\": container with ID starting with f7965fd2754f3fed9f492b922d76672b2c5d466e45ed42f94383c2d95dbe2681 not found: ID does not exist" containerID="f7965fd2754f3fed9f492b922d76672b2c5d466e45ed42f94383c2d95dbe2681"
Apr 16 17:14:20.214295 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:20.214250 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7965fd2754f3fed9f492b922d76672b2c5d466e45ed42f94383c2d95dbe2681"} err="failed to get container status \"f7965fd2754f3fed9f492b922d76672b2c5d466e45ed42f94383c2d95dbe2681\": rpc error: code = NotFound desc = could not find container \"f7965fd2754f3fed9f492b922d76672b2c5d466e45ed42f94383c2d95dbe2681\": container with ID starting with f7965fd2754f3fed9f492b922d76672b2c5d466e45ed42f94383c2d95dbe2681 not found: ID does not exist"
Apr 16 17:14:20.214295 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:20.214268 2572 scope.go:117] "RemoveContainer" containerID="1f22dfd392acb724634634a8cad1548ebf9b783a17860209ff0f543f585fcc14"
Apr 16 17:14:20.214510 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:14:20.214496 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f22dfd392acb724634634a8cad1548ebf9b783a17860209ff0f543f585fcc14\": container with ID starting with 1f22dfd392acb724634634a8cad1548ebf9b783a17860209ff0f543f585fcc14 not found: ID does not exist" containerID="1f22dfd392acb724634634a8cad1548ebf9b783a17860209ff0f543f585fcc14"
Apr 16 17:14:20.214554 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:20.214514 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f22dfd392acb724634634a8cad1548ebf9b783a17860209ff0f543f585fcc14"} err="failed to get container status \"1f22dfd392acb724634634a8cad1548ebf9b783a17860209ff0f543f585fcc14\": rpc error: code = NotFound desc = could not find container \"1f22dfd392acb724634634a8cad1548ebf9b783a17860209ff0f543f585fcc14\": container with ID starting with 1f22dfd392acb724634634a8cad1548ebf9b783a17860209ff0f543f585fcc14 not found: ID does not exist"
Apr 16 17:14:20.214554 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:20.214526 2572 scope.go:117] "RemoveContainer" containerID="d678c07d91571f790df3b6db30389d63e8cf72c097fa0c36eab9b7967ffe82cc"
Apr 16 17:14:20.214748 ip-10-0-137-126 kubenswrapper[2572]: E0416 17:14:20.214733 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d678c07d91571f790df3b6db30389d63e8cf72c097fa0c36eab9b7967ffe82cc\": container with ID starting with d678c07d91571f790df3b6db30389d63e8cf72c097fa0c36eab9b7967ffe82cc not found: ID does not exist" containerID="d678c07d91571f790df3b6db30389d63e8cf72c097fa0c36eab9b7967ffe82cc"
Apr 16 17:14:20.214791 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:20.214750 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d678c07d91571f790df3b6db30389d63e8cf72c097fa0c36eab9b7967ffe82cc"} err="failed to get container status \"d678c07d91571f790df3b6db30389d63e8cf72c097fa0c36eab9b7967ffe82cc\": rpc error: code = NotFound desc = could not find container \"d678c07d91571f790df3b6db30389d63e8cf72c097fa0c36eab9b7967ffe82cc\": container with ID starting with d678c07d91571f790df3b6db30389d63e8cf72c097fa0c36eab9b7967ffe82cc not found: ID does not exist"
Apr 16 17:14:21.590525 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:21.590485 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86822240-f0d7-4b1c-9c15-c87383363f81" path="/var/lib/kubelet/pods/86822240-f0d7-4b1c-9c15-c87383363f81/volumes"
Apr 16 17:14:23.236820 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:23.236776 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" podUID="75cdbd21-24ab-4542-9601-3840e16e313d" containerName="main" probeResult="failure" output="Get \"https://10.133.0.65:8000/health\": dial tcp 10.133.0.65:8000: connect: connection refused"
Apr 16 17:14:33.236225 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:33.236174 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" podUID="75cdbd21-24ab-4542-9601-3840e16e313d" containerName="main" probeResult="failure" output="Get \"https://10.133.0.65:8000/health\": dial tcp 10.133.0.65:8000: connect: connection refused"
Apr 16 17:14:43.235865 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:43.235811 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" podUID="75cdbd21-24ab-4542-9601-3840e16e313d" containerName="main" probeResult="failure" output="Get \"https://10.133.0.65:8000/health\": dial tcp 10.133.0.65:8000: connect: connection refused"
Apr 16 17:14:53.236655 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:14:53.236614 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" podUID="75cdbd21-24ab-4542-9601-3840e16e313d" containerName="main" probeResult="failure" output="Get \"https://10.133.0.65:8000/health\": dial tcp 10.133.0.65:8000: connect: connection refused"
Apr 16 17:15:03.236687 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:03.236649 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" podUID="75cdbd21-24ab-4542-9601-3840e16e313d" containerName="main" probeResult="failure" output="Get \"https://10.133.0.65:8000/health\": dial tcp 10.133.0.65:8000: connect: connection refused"
Apr 16 17:15:13.245756 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:13.245717 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld"
Apr 16 17:15:13.253560 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:13.253541 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld"
Apr 16 17:15:24.415321 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:24.415286 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld"]
Apr 16 17:15:24.415810 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:24.415577 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" podUID="75cdbd21-24ab-4542-9601-3840e16e313d" containerName="main" containerID="cri-o://9ad34cdf99b91747c501ab1db0908eb84581cfb0963fdd02796b04794da8d8a6" gracePeriod=30
Apr 16 17:15:25.566563 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.566527 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hh5rr/must-gather-xvn4h"]
Apr 16 17:15:25.566920 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.566894 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fbcd0ec2-e2ae-417c-aba4-086eb9fba102" containerName="storage-initializer"
Apr 16 17:15:25.566920 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.566905 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbcd0ec2-e2ae-417c-aba4-086eb9fba102" containerName="storage-initializer"
Apr 16 17:15:25.566920 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.566913 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f13b4cc2-1661-4f97-af96-93ea0d79f1af" containerName="storage-initializer"
Apr 16 17:15:25.566920 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.566918 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f13b4cc2-1661-4f97-af96-93ea0d79f1af" containerName="storage-initializer"
Apr 16 17:15:25.567050 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.566928 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fbcd0ec2-e2ae-417c-aba4-086eb9fba102" containerName="main"
Apr 16 17:15:25.567050 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.566934 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbcd0ec2-e2ae-417c-aba4-086eb9fba102" containerName="main"
Apr 16 17:15:25.567050 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.566944 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23f8a34b-3c7e-4539-a8b5-43fac58d50e8" containerName="storage-initializer"
Apr 16 17:15:25.567050 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.566949 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="23f8a34b-3c7e-4539-a8b5-43fac58d50e8" containerName="storage-initializer"
Apr 16 17:15:25.567050 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.566957 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23f8a34b-3c7e-4539-a8b5-43fac58d50e8" containerName="main"
Apr 16 17:15:25.567050 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.566962 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="23f8a34b-3c7e-4539-a8b5-43fac58d50e8" containerName="main"
Apr 16 17:15:25.567050 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.566967 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f13b4cc2-1661-4f97-af96-93ea0d79f1af" containerName="main"
Apr 16 17:15:25.567050 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.566972 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f13b4cc2-1661-4f97-af96-93ea0d79f1af" containerName="main"
Apr 16 17:15:25.567050 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.566985 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86822240-f0d7-4b1c-9c15-c87383363f81" containerName="storage-initializer"
Apr 16 17:15:25.567050 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.566991 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="86822240-f0d7-4b1c-9c15-c87383363f81" containerName="storage-initializer"
Apr 16 17:15:25.567050 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.566998 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86822240-f0d7-4b1c-9c15-c87383363f81" containerName="main"
Apr 16 17:15:25.567050 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.567002 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="86822240-f0d7-4b1c-9c15-c87383363f81" containerName="main"
Apr 16 17:15:25.567050 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.567007 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86822240-f0d7-4b1c-9c15-c87383363f81" containerName="tokenizer"
Apr 16 17:15:25.567050 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.567012 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="86822240-f0d7-4b1c-9c15-c87383363f81" containerName="tokenizer"
Apr 16 17:15:25.567050 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.567023 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f13b4cc2-1661-4f97-af96-93ea0d79f1af" containerName="llm-d-routing-sidecar"
Apr 16 17:15:25.567050 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.567028 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f13b4cc2-1661-4f97-af96-93ea0d79f1af" containerName="llm-d-routing-sidecar"
Apr 16 17:15:25.567547 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.567088 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f13b4cc2-1661-4f97-af96-93ea0d79f1af" containerName="llm-d-routing-sidecar"
Apr 16 17:15:25.567547 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.567098 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f13b4cc2-1661-4f97-af96-93ea0d79f1af" containerName="main"
Apr 16 17:15:25.567547 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.567104 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="86822240-f0d7-4b1c-9c15-c87383363f81" containerName="tokenizer"
Apr 16 17:15:25.567547 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.567111 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="fbcd0ec2-e2ae-417c-aba4-086eb9fba102" containerName="main"
Apr 16 17:15:25.567547 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.567119 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="23f8a34b-3c7e-4539-a8b5-43fac58d50e8" containerName="main"
Apr 16 17:15:25.567547 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.567124 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="86822240-f0d7-4b1c-9c15-c87383363f81" containerName="main"
Apr 16 17:15:25.569989 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.569969 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hh5rr/must-gather-xvn4h"
Apr 16 17:15:25.572867 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.572845 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-hh5rr\"/\"openshift-service-ca.crt\""
Apr 16 17:15:25.574080 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.574046 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-hh5rr\"/\"kube-root-ca.crt\""
Apr 16 17:15:25.574191 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.574115 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-hh5rr\"/\"default-dockercfg-7ffvz\""
Apr 16 17:15:25.577895 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.577874 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hh5rr/must-gather-xvn4h"]
Apr 16 17:15:25.656718 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.656691 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bkqm\" (UniqueName: \"kubernetes.io/projected/c7d854cf-e575-4d81-99a8-1b2c5db48239-kube-api-access-4bkqm\") pod \"must-gather-xvn4h\" (UID: \"c7d854cf-e575-4d81-99a8-1b2c5db48239\") " pod="openshift-must-gather-hh5rr/must-gather-xvn4h"
Apr 16 17:15:25.656843 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.656782 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c7d854cf-e575-4d81-99a8-1b2c5db48239-must-gather-output\") pod \"must-gather-xvn4h\" (UID: \"c7d854cf-e575-4d81-99a8-1b2c5db48239\") " pod="openshift-must-gather-hh5rr/must-gather-xvn4h"
Apr 16 17:15:25.757745 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.757716 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4bkqm\" (UniqueName:
\"kubernetes.io/projected/c7d854cf-e575-4d81-99a8-1b2c5db48239-kube-api-access-4bkqm\") pod \"must-gather-xvn4h\" (UID: \"c7d854cf-e575-4d81-99a8-1b2c5db48239\") " pod="openshift-must-gather-hh5rr/must-gather-xvn4h" Apr 16 17:15:25.757873 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.757782 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c7d854cf-e575-4d81-99a8-1b2c5db48239-must-gather-output\") pod \"must-gather-xvn4h\" (UID: \"c7d854cf-e575-4d81-99a8-1b2c5db48239\") " pod="openshift-must-gather-hh5rr/must-gather-xvn4h" Apr 16 17:15:25.758042 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.758024 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c7d854cf-e575-4d81-99a8-1b2c5db48239-must-gather-output\") pod \"must-gather-xvn4h\" (UID: \"c7d854cf-e575-4d81-99a8-1b2c5db48239\") " pod="openshift-must-gather-hh5rr/must-gather-xvn4h" Apr 16 17:15:25.766242 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.766223 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bkqm\" (UniqueName: \"kubernetes.io/projected/c7d854cf-e575-4d81-99a8-1b2c5db48239-kube-api-access-4bkqm\") pod \"must-gather-xvn4h\" (UID: \"c7d854cf-e575-4d81-99a8-1b2c5db48239\") " pod="openshift-must-gather-hh5rr/must-gather-xvn4h" Apr 16 17:15:25.880227 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:25.880206 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hh5rr/must-gather-xvn4h" Apr 16 17:15:26.203963 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:26.203942 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hh5rr/must-gather-xvn4h"] Apr 16 17:15:26.205740 ip-10-0-137-126 kubenswrapper[2572]: W0416 17:15:26.205711 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7d854cf_e575_4d81_99a8_1b2c5db48239.slice/crio-bfca8686aee75a9c02c2d6be8c81f28cae1140419803bdb3b90a5713435713ed WatchSource:0}: Error finding container bfca8686aee75a9c02c2d6be8c81f28cae1140419803bdb3b90a5713435713ed: Status 404 returned error can't find the container with id bfca8686aee75a9c02c2d6be8c81f28cae1140419803bdb3b90a5713435713ed Apr 16 17:15:26.207353 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:26.207335 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 17:15:26.438585 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:26.438545 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hh5rr/must-gather-xvn4h" event={"ID":"c7d854cf-e575-4d81-99a8-1b2c5db48239","Type":"ContainerStarted","Data":"bfca8686aee75a9c02c2d6be8c81f28cae1140419803bdb3b90a5713435713ed"} Apr 16 17:15:31.462322 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:31.462283 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hh5rr/must-gather-xvn4h" event={"ID":"c7d854cf-e575-4d81-99a8-1b2c5db48239","Type":"ContainerStarted","Data":"fce27a797a94ca23262639aba0f78d4f178d3db2cd2f0f3c3c129376be7a3552"} Apr 16 17:15:31.462322 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:31.462324 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hh5rr/must-gather-xvn4h" 
event={"ID":"c7d854cf-e575-4d81-99a8-1b2c5db48239","Type":"ContainerStarted","Data":"62664eb4355512a3f30564a30fd82c10f3a32e91ad97bd2163b7dbf0bbafb6f5"} Apr 16 17:15:31.479814 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:31.479766 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hh5rr/must-gather-xvn4h" podStartSLOduration=2.3102651610000002 podStartE2EDuration="6.479752734s" podCreationTimestamp="2026-04-16 17:15:25 +0000 UTC" firstStartedPulling="2026-04-16 17:15:26.207451829 +0000 UTC m=+1643.376163170" lastFinishedPulling="2026-04-16 17:15:30.376939402 +0000 UTC m=+1647.545650743" observedRunningTime="2026-04-16 17:15:31.477019609 +0000 UTC m=+1648.645730985" watchObservedRunningTime="2026-04-16 17:15:31.479752734 +0000 UTC m=+1648.648464096" Apr 16 17:15:39.655452 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:39.655424 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld_75cdbd21-24ab-4542-9601-3840e16e313d/main/0.log" Apr 16 17:15:39.703352 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:39.703322 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld_75cdbd21-24ab-4542-9601-3840e16e313d/storage-initializer/0.log" Apr 16 17:15:40.675625 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:40.675598 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld_75cdbd21-24ab-4542-9601-3840e16e313d/main/0.log" Apr 16 17:15:40.685349 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:40.685330 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld_75cdbd21-24ab-4542-9601-3840e16e313d/storage-initializer/0.log" Apr 16 17:15:41.688021 ip-10-0-137-126 kubenswrapper[2572]: I0416 
17:15:41.687990 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld_75cdbd21-24ab-4542-9601-3840e16e313d/main/0.log" Apr 16 17:15:41.695473 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:41.695450 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld_75cdbd21-24ab-4542-9601-3840e16e313d/storage-initializer/0.log" Apr 16 17:15:42.677746 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:42.677719 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld_75cdbd21-24ab-4542-9601-3840e16e313d/main/0.log" Apr 16 17:15:42.685113 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:42.685092 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld_75cdbd21-24ab-4542-9601-3840e16e313d/storage-initializer/0.log" Apr 16 17:15:43.644353 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:43.644319 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld_75cdbd21-24ab-4542-9601-3840e16e313d/main/0.log" Apr 16 17:15:43.655790 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:43.655741 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld_75cdbd21-24ab-4542-9601-3840e16e313d/storage-initializer/0.log" Apr 16 17:15:44.605912 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:44.605883 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld_75cdbd21-24ab-4542-9601-3840e16e313d/main/0.log" Apr 16 17:15:44.613049 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:44.613026 2572 log.go:25] "Finished parsing log 
file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld_75cdbd21-24ab-4542-9601-3840e16e313d/storage-initializer/0.log" Apr 16 17:15:45.602401 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:45.602365 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld_75cdbd21-24ab-4542-9601-3840e16e313d/main/0.log" Apr 16 17:15:45.610129 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:45.610105 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld_75cdbd21-24ab-4542-9601-3840e16e313d/storage-initializer/0.log" Apr 16 17:15:46.583732 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:46.583692 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld_75cdbd21-24ab-4542-9601-3840e16e313d/main/0.log" Apr 16 17:15:46.592270 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:46.592234 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld_75cdbd21-24ab-4542-9601-3840e16e313d/storage-initializer/0.log" Apr 16 17:15:47.612414 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:47.612379 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld_75cdbd21-24ab-4542-9601-3840e16e313d/main/0.log" Apr 16 17:15:47.619702 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:47.619682 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld_75cdbd21-24ab-4542-9601-3840e16e313d/storage-initializer/0.log" Apr 16 17:15:48.643586 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:48.643558 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld_75cdbd21-24ab-4542-9601-3840e16e313d/main/0.log" Apr 16 17:15:48.651471 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:48.651447 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld_75cdbd21-24ab-4542-9601-3840e16e313d/storage-initializer/0.log" Apr 16 17:15:49.625778 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:49.625749 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld_75cdbd21-24ab-4542-9601-3840e16e313d/main/0.log" Apr 16 17:15:49.632452 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:49.632428 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld_75cdbd21-24ab-4542-9601-3840e16e313d/storage-initializer/0.log" Apr 16 17:15:50.609447 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:50.609416 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld_75cdbd21-24ab-4542-9601-3840e16e313d/main/0.log" Apr 16 17:15:50.616750 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:50.616717 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld_75cdbd21-24ab-4542-9601-3840e16e313d/storage-initializer/0.log" Apr 16 17:15:51.589620 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:51.589592 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld_75cdbd21-24ab-4542-9601-3840e16e313d/main/0.log" Apr 16 17:15:51.596625 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:51.596601 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld_75cdbd21-24ab-4542-9601-3840e16e313d/storage-initializer/0.log" Apr 16 17:15:52.571350 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:52.571321 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld_75cdbd21-24ab-4542-9601-3840e16e313d/main/0.log" Apr 16 17:15:52.578805 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:52.578783 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld_75cdbd21-24ab-4542-9601-3840e16e313d/storage-initializer/0.log" Apr 16 17:15:53.516961 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:53.516922 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-pm5bn_c11ab2b7-a15d-45b9-95e3-690208f5d272/istio-proxy/0.log" Apr 16 17:15:54.335866 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:54.335830 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-pm5bn_c11ab2b7-a15d-45b9-95e3-690208f5d272/istio-proxy/0.log" Apr 16 17:15:54.556668 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:54.556637 2572 generic.go:358] "Generic (PLEG): container finished" podID="75cdbd21-24ab-4542-9601-3840e16e313d" containerID="9ad34cdf99b91747c501ab1db0908eb84581cfb0963fdd02796b04794da8d8a6" exitCode=137 Apr 16 17:15:54.556811 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:54.556679 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" event={"ID":"75cdbd21-24ab-4542-9601-3840e16e313d","Type":"ContainerDied","Data":"9ad34cdf99b91747c501ab1db0908eb84581cfb0963fdd02796b04794da8d8a6"} Apr 16 17:15:54.679689 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:54.679666 
2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" Apr 16 17:15:54.709948 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:54.709921 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/75cdbd21-24ab-4542-9601-3840e16e313d-kserve-provision-location\") pod \"75cdbd21-24ab-4542-9601-3840e16e313d\" (UID: \"75cdbd21-24ab-4542-9601-3840e16e313d\") " Apr 16 17:15:54.710141 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:54.709994 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/75cdbd21-24ab-4542-9601-3840e16e313d-dshm\") pod \"75cdbd21-24ab-4542-9601-3840e16e313d\" (UID: \"75cdbd21-24ab-4542-9601-3840e16e313d\") " Apr 16 17:15:54.710141 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:54.710020 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/75cdbd21-24ab-4542-9601-3840e16e313d-home\") pod \"75cdbd21-24ab-4542-9601-3840e16e313d\" (UID: \"75cdbd21-24ab-4542-9601-3840e16e313d\") " Apr 16 17:15:54.710141 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:54.710057 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlv8t\" (UniqueName: \"kubernetes.io/projected/75cdbd21-24ab-4542-9601-3840e16e313d-kube-api-access-tlv8t\") pod \"75cdbd21-24ab-4542-9601-3840e16e313d\" (UID: \"75cdbd21-24ab-4542-9601-3840e16e313d\") " Apr 16 17:15:54.710141 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:54.710102 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/75cdbd21-24ab-4542-9601-3840e16e313d-tls-certs\") pod \"75cdbd21-24ab-4542-9601-3840e16e313d\" (UID: 
\"75cdbd21-24ab-4542-9601-3840e16e313d\") " Apr 16 17:15:54.710141 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:54.710123 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/75cdbd21-24ab-4542-9601-3840e16e313d-model-cache\") pod \"75cdbd21-24ab-4542-9601-3840e16e313d\" (UID: \"75cdbd21-24ab-4542-9601-3840e16e313d\") " Apr 16 17:15:54.710623 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:54.710594 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75cdbd21-24ab-4542-9601-3840e16e313d-model-cache" (OuterVolumeSpecName: "model-cache") pod "75cdbd21-24ab-4542-9601-3840e16e313d" (UID: "75cdbd21-24ab-4542-9601-3840e16e313d"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:15:54.710762 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:54.710711 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75cdbd21-24ab-4542-9601-3840e16e313d-home" (OuterVolumeSpecName: "home") pod "75cdbd21-24ab-4542-9601-3840e16e313d" (UID: "75cdbd21-24ab-4542-9601-3840e16e313d"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:15:54.713701 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:54.713674 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75cdbd21-24ab-4542-9601-3840e16e313d-dshm" (OuterVolumeSpecName: "dshm") pod "75cdbd21-24ab-4542-9601-3840e16e313d" (UID: "75cdbd21-24ab-4542-9601-3840e16e313d"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:15:54.713830 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:54.713809 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75cdbd21-24ab-4542-9601-3840e16e313d-kube-api-access-tlv8t" (OuterVolumeSpecName: "kube-api-access-tlv8t") pod "75cdbd21-24ab-4542-9601-3840e16e313d" (UID: "75cdbd21-24ab-4542-9601-3840e16e313d"). InnerVolumeSpecName "kube-api-access-tlv8t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:15:54.714414 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:54.714393 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75cdbd21-24ab-4542-9601-3840e16e313d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "75cdbd21-24ab-4542-9601-3840e16e313d" (UID: "75cdbd21-24ab-4542-9601-3840e16e313d"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:15:54.789957 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:54.789907 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75cdbd21-24ab-4542-9601-3840e16e313d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "75cdbd21-24ab-4542-9601-3840e16e313d" (UID: "75cdbd21-24ab-4542-9601-3840e16e313d"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:15:54.811120 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:54.811094 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/75cdbd21-24ab-4542-9601-3840e16e313d-kserve-provision-location\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:15:54.811120 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:54.811118 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/75cdbd21-24ab-4542-9601-3840e16e313d-dshm\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:15:54.811120 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:54.811128 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/75cdbd21-24ab-4542-9601-3840e16e313d-home\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:15:54.811331 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:54.811137 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tlv8t\" (UniqueName: \"kubernetes.io/projected/75cdbd21-24ab-4542-9601-3840e16e313d-kube-api-access-tlv8t\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:15:54.811331 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:54.811147 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/75cdbd21-24ab-4542-9601-3840e16e313d-tls-certs\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:15:54.811331 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:54.811155 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/75cdbd21-24ab-4542-9601-3840e16e313d-model-cache\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:15:55.160432 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:55.160396 2572 
log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-qrvbs_4ab47623-d87a-4da7-bd7b-aa5916314031/manager/0.log" Apr 16 17:15:55.177727 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:55.177698 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-6pk8c_97b063e2-ae46-489b-b95c-bfa8856292f0/manager/0.log" Apr 16 17:15:55.222780 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:55.222744 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-jn6hg_b37eedbd-8438-4d03-a544-50830b57acf4/manager/0.log" Apr 16 17:15:55.230852 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:55.230833 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-lbws6_fab44b43-0ba8-4c44-8c2f-1cfc3e92a166/limitador/0.log" Apr 16 17:15:55.562742 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:55.562660 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" event={"ID":"75cdbd21-24ab-4542-9601-3840e16e313d","Type":"ContainerDied","Data":"1ae17f584d0cdae1a7a1152908bc373ac7400f2fbb3d5853009e5ac1b33f5335"} Apr 16 17:15:55.562742 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:55.562704 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld" Apr 16 17:15:55.563267 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:55.562707 2572 scope.go:117] "RemoveContainer" containerID="9ad34cdf99b91747c501ab1db0908eb84581cfb0963fdd02796b04794da8d8a6" Apr 16 17:15:55.585959 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:55.585933 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld"] Apr 16 17:15:55.591351 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:55.591323 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64c46b7dfb-d9lld"] Apr 16 17:15:55.591541 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:55.591507 2572 scope.go:117] "RemoveContainer" containerID="6a0f721a682e3f3a1bea400b4ee5ae0a5cc3e7aa380f509a28525be18d5a8998" Apr 16 17:15:56.569122 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:56.569084 2572 generic.go:358] "Generic (PLEG): container finished" podID="c7d854cf-e575-4d81-99a8-1b2c5db48239" containerID="62664eb4355512a3f30564a30fd82c10f3a32e91ad97bd2163b7dbf0bbafb6f5" exitCode=0 Apr 16 17:15:56.569122 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:56.569089 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hh5rr/must-gather-xvn4h" event={"ID":"c7d854cf-e575-4d81-99a8-1b2c5db48239","Type":"ContainerDied","Data":"62664eb4355512a3f30564a30fd82c10f3a32e91ad97bd2163b7dbf0bbafb6f5"} Apr 16 17:15:56.569592 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:56.569423 2572 scope.go:117] "RemoveContainer" containerID="62664eb4355512a3f30564a30fd82c10f3a32e91ad97bd2163b7dbf0bbafb6f5" Apr 16 17:15:56.969125 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:56.969094 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hh5rr_must-gather-xvn4h_c7d854cf-e575-4d81-99a8-1b2c5db48239/gather/0.log" 
Apr 16 17:15:57.590367 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:57.590336 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75cdbd21-24ab-4542-9601-3840e16e313d" path="/var/lib/kubelet/pods/75cdbd21-24ab-4542-9601-3840e16e313d/volumes" Apr 16 17:15:57.617273 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:57.617242 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-msdj2/must-gather-dwjcb"] Apr 16 17:15:57.617598 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:57.617586 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75cdbd21-24ab-4542-9601-3840e16e313d" containerName="main" Apr 16 17:15:57.617641 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:57.617600 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="75cdbd21-24ab-4542-9601-3840e16e313d" containerName="main" Apr 16 17:15:57.617641 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:57.617617 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75cdbd21-24ab-4542-9601-3840e16e313d" containerName="storage-initializer" Apr 16 17:15:57.617641 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:57.617625 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="75cdbd21-24ab-4542-9601-3840e16e313d" containerName="storage-initializer" Apr 16 17:15:57.617731 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:57.617691 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="75cdbd21-24ab-4542-9601-3840e16e313d" containerName="main" Apr 16 17:15:57.622000 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:57.621982 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-msdj2/must-gather-dwjcb" Apr 16 17:15:57.624772 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:57.624748 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-msdj2\"/\"kube-root-ca.crt\"" Apr 16 17:15:57.624861 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:57.624770 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-msdj2\"/\"default-dockercfg-4rjqv\"" Apr 16 17:15:57.624861 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:57.624789 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-msdj2\"/\"openshift-service-ca.crt\"" Apr 16 17:15:57.631163 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:57.631140 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-msdj2/must-gather-dwjcb"] Apr 16 17:15:57.736146 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:57.736112 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk8lf\" (UniqueName: \"kubernetes.io/projected/9ebd8fd2-77bb-43b9-a138-c3dbb53f5521-kube-api-access-pk8lf\") pod \"must-gather-dwjcb\" (UID: \"9ebd8fd2-77bb-43b9-a138-c3dbb53f5521\") " pod="openshift-must-gather-msdj2/must-gather-dwjcb" Apr 16 17:15:57.736278 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:57.736191 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9ebd8fd2-77bb-43b9-a138-c3dbb53f5521-must-gather-output\") pod \"must-gather-dwjcb\" (UID: \"9ebd8fd2-77bb-43b9-a138-c3dbb53f5521\") " pod="openshift-must-gather-msdj2/must-gather-dwjcb" Apr 16 17:15:57.837577 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:57.837546 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pk8lf\" (UniqueName: 
\"kubernetes.io/projected/9ebd8fd2-77bb-43b9-a138-c3dbb53f5521-kube-api-access-pk8lf\") pod \"must-gather-dwjcb\" (UID: \"9ebd8fd2-77bb-43b9-a138-c3dbb53f5521\") " pod="openshift-must-gather-msdj2/must-gather-dwjcb" Apr 16 17:15:57.837707 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:57.837589 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9ebd8fd2-77bb-43b9-a138-c3dbb53f5521-must-gather-output\") pod \"must-gather-dwjcb\" (UID: \"9ebd8fd2-77bb-43b9-a138-c3dbb53f5521\") " pod="openshift-must-gather-msdj2/must-gather-dwjcb" Apr 16 17:15:57.837846 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:57.837830 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9ebd8fd2-77bb-43b9-a138-c3dbb53f5521-must-gather-output\") pod \"must-gather-dwjcb\" (UID: \"9ebd8fd2-77bb-43b9-a138-c3dbb53f5521\") " pod="openshift-must-gather-msdj2/must-gather-dwjcb" Apr 16 17:15:57.847565 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:57.847509 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk8lf\" (UniqueName: \"kubernetes.io/projected/9ebd8fd2-77bb-43b9-a138-c3dbb53f5521-kube-api-access-pk8lf\") pod \"must-gather-dwjcb\" (UID: \"9ebd8fd2-77bb-43b9-a138-c3dbb53f5521\") " pod="openshift-must-gather-msdj2/must-gather-dwjcb" Apr 16 17:15:57.931532 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:57.931509 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-msdj2/must-gather-dwjcb" Apr 16 17:15:58.048765 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:58.048738 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-msdj2/must-gather-dwjcb"] Apr 16 17:15:58.050687 ip-10-0-137-126 kubenswrapper[2572]: W0416 17:15:58.050658 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ebd8fd2_77bb_43b9_a138_c3dbb53f5521.slice/crio-ebf23737c1cafd442bdcf35da224cbf6dd37dcbb7ad5b23b3f1a275b503240be WatchSource:0}: Error finding container ebf23737c1cafd442bdcf35da224cbf6dd37dcbb7ad5b23b3f1a275b503240be: Status 404 returned error can't find the container with id ebf23737c1cafd442bdcf35da224cbf6dd37dcbb7ad5b23b3f1a275b503240be Apr 16 17:15:58.577911 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:58.577879 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-msdj2/must-gather-dwjcb" event={"ID":"9ebd8fd2-77bb-43b9-a138-c3dbb53f5521","Type":"ContainerStarted","Data":"ebf23737c1cafd442bdcf35da224cbf6dd37dcbb7ad5b23b3f1a275b503240be"} Apr 16 17:15:59.584876 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:59.584829 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-msdj2/must-gather-dwjcb" event={"ID":"9ebd8fd2-77bb-43b9-a138-c3dbb53f5521","Type":"ContainerStarted","Data":"8782047eeebafab82ae4920432ce91522d09c580dedf089432be00696c8fb50d"} Apr 16 17:15:59.584876 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:59.584880 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-msdj2/must-gather-dwjcb" event={"ID":"9ebd8fd2-77bb-43b9-a138-c3dbb53f5521","Type":"ContainerStarted","Data":"72fb6b96999f0c04b8f845e66c7d59119cc43cdfd80b4195710a8e81ddd332ae"} Apr 16 17:15:59.603839 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:15:59.603774 2572 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-msdj2/must-gather-dwjcb" podStartSLOduration=1.886503567 podStartE2EDuration="2.603757443s" podCreationTimestamp="2026-04-16 17:15:57 +0000 UTC" firstStartedPulling="2026-04-16 17:15:58.052373111 +0000 UTC m=+1675.221084453" lastFinishedPulling="2026-04-16 17:15:58.76962698 +0000 UTC m=+1675.938338329" observedRunningTime="2026-04-16 17:15:59.59974634 +0000 UTC m=+1676.768457703" watchObservedRunningTime="2026-04-16 17:15:59.603757443 +0000 UTC m=+1676.772468807" Apr 16 17:16:00.423216 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:00.423184 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-nnbtw_35f5a8e4-d220-4bcd-bbc1-7031d2c0ad08/global-pull-secret-syncer/0.log" Apr 16 17:16:00.457466 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:00.457427 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-4chxn_b78e00dd-6abc-4e46-83bf-28cd51e87cc9/konnectivity-agent/0.log" Apr 16 17:16:00.539602 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:00.539576 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-126.ec2.internal_a676821af174364e03710748c5f10fbf/haproxy/0.log" Apr 16 17:16:02.486833 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:02.485981 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hh5rr/must-gather-xvn4h"] Apr 16 17:16:02.486833 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:02.486283 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-hh5rr/must-gather-xvn4h" podUID="c7d854cf-e575-4d81-99a8-1b2c5db48239" containerName="copy" containerID="cri-o://fce27a797a94ca23262639aba0f78d4f178d3db2cd2f0f3c3c129376be7a3552" gracePeriod=2 Apr 16 17:16:02.488694 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:02.488641 2572 status_manager.go:895] "Failed to get status for pod" 
podUID="c7d854cf-e575-4d81-99a8-1b2c5db48239" pod="openshift-must-gather-hh5rr/must-gather-xvn4h" err="pods \"must-gather-xvn4h\" is forbidden: User \"system:node:ip-10-0-137-126.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-hh5rr\": no relationship found between node 'ip-10-0-137-126.ec2.internal' and this object" Apr 16 17:16:02.489333 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:02.489308 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hh5rr/must-gather-xvn4h"] Apr 16 17:16:02.609801 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:02.609778 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hh5rr_must-gather-xvn4h_c7d854cf-e575-4d81-99a8-1b2c5db48239/copy/0.log" Apr 16 17:16:02.610509 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:02.610323 2572 generic.go:358] "Generic (PLEG): container finished" podID="c7d854cf-e575-4d81-99a8-1b2c5db48239" containerID="fce27a797a94ca23262639aba0f78d4f178d3db2cd2f0f3c3c129376be7a3552" exitCode=143 Apr 16 17:16:02.841094 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:02.840767 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hh5rr_must-gather-xvn4h_c7d854cf-e575-4d81-99a8-1b2c5db48239/copy/0.log" Apr 16 17:16:02.841563 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:02.841298 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hh5rr/must-gather-xvn4h" Apr 16 17:16:02.843735 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:02.843675 2572 status_manager.go:895] "Failed to get status for pod" podUID="c7d854cf-e575-4d81-99a8-1b2c5db48239" pod="openshift-must-gather-hh5rr/must-gather-xvn4h" err="pods \"must-gather-xvn4h\" is forbidden: User \"system:node:ip-10-0-137-126.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-hh5rr\": no relationship found between node 'ip-10-0-137-126.ec2.internal' and this object" Apr 16 17:16:02.894055 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:02.887321 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bkqm\" (UniqueName: \"kubernetes.io/projected/c7d854cf-e575-4d81-99a8-1b2c5db48239-kube-api-access-4bkqm\") pod \"c7d854cf-e575-4d81-99a8-1b2c5db48239\" (UID: \"c7d854cf-e575-4d81-99a8-1b2c5db48239\") " Apr 16 17:16:02.894055 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:02.887391 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c7d854cf-e575-4d81-99a8-1b2c5db48239-must-gather-output\") pod \"c7d854cf-e575-4d81-99a8-1b2c5db48239\" (UID: \"c7d854cf-e575-4d81-99a8-1b2c5db48239\") " Apr 16 17:16:02.896885 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:02.896853 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7d854cf-e575-4d81-99a8-1b2c5db48239-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c7d854cf-e575-4d81-99a8-1b2c5db48239" (UID: "c7d854cf-e575-4d81-99a8-1b2c5db48239"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:16:02.898679 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:02.898631 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7d854cf-e575-4d81-99a8-1b2c5db48239-kube-api-access-4bkqm" (OuterVolumeSpecName: "kube-api-access-4bkqm") pod "c7d854cf-e575-4d81-99a8-1b2c5db48239" (UID: "c7d854cf-e575-4d81-99a8-1b2c5db48239"). InnerVolumeSpecName "kube-api-access-4bkqm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:16:02.988390 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:02.988320 2572 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c7d854cf-e575-4d81-99a8-1b2c5db48239-must-gather-output\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:16:02.988390 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:02.988359 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4bkqm\" (UniqueName: \"kubernetes.io/projected/c7d854cf-e575-4d81-99a8-1b2c5db48239-kube-api-access-4bkqm\") on node \"ip-10-0-137-126.ec2.internal\" DevicePath \"\"" Apr 16 17:16:03.597785 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:03.597731 2572 status_manager.go:895] "Failed to get status for pod" podUID="c7d854cf-e575-4d81-99a8-1b2c5db48239" pod="openshift-must-gather-hh5rr/must-gather-xvn4h" err="pods \"must-gather-xvn4h\" is forbidden: User \"system:node:ip-10-0-137-126.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-hh5rr\": no relationship found between node 'ip-10-0-137-126.ec2.internal' and this object" Apr 16 17:16:03.605258 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:03.605223 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7d854cf-e575-4d81-99a8-1b2c5db48239" path="/var/lib/kubelet/pods/c7d854cf-e575-4d81-99a8-1b2c5db48239/volumes" Apr 16 17:16:03.616298 
ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:03.616232 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hh5rr_must-gather-xvn4h_c7d854cf-e575-4d81-99a8-1b2c5db48239/copy/0.log" Apr 16 17:16:03.620079 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:03.616852 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hh5rr/must-gather-xvn4h" Apr 16 17:16:03.620079 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:03.616946 2572 scope.go:117] "RemoveContainer" containerID="fce27a797a94ca23262639aba0f78d4f178d3db2cd2f0f3c3c129376be7a3552" Apr 16 17:16:03.632797 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:03.632695 2572 scope.go:117] "RemoveContainer" containerID="62664eb4355512a3f30564a30fd82c10f3a32e91ad97bd2163b7dbf0bbafb6f5" Apr 16 17:16:04.607003 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:04.606923 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-qrvbs_4ab47623-d87a-4da7-bd7b-aa5916314031/manager/0.log" Apr 16 17:16:04.637365 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:04.637290 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-6pk8c_97b063e2-ae46-489b-b95c-bfa8856292f0/manager/0.log" Apr 16 17:16:04.709084 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:04.709043 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-jn6hg_b37eedbd-8438-4d03-a544-50830b57acf4/manager/0.log" Apr 16 17:16:04.729761 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:04.729736 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-lbws6_fab44b43-0ba8-4c44-8c2f-1cfc3e92a166/limitador/0.log" Apr 16 17:16:06.031581 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:06.031549 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-dw57d_a2eb8640-dd12-4e24-a36c-be01ef52908a/node-exporter/0.log" Apr 16 17:16:06.050752 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:06.050721 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dw57d_a2eb8640-dd12-4e24-a36c-be01ef52908a/kube-rbac-proxy/0.log" Apr 16 17:16:06.070304 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:06.070273 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dw57d_a2eb8640-dd12-4e24-a36c-be01ef52908a/init-textfile/0.log" Apr 16 17:16:06.393237 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:06.393196 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-7t7n4_ac362f0d-aa0b-4b45-bcdd-4549f444fd35/prometheus-operator/0.log" Apr 16 17:16:06.419014 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:06.418968 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-7t7n4_ac362f0d-aa0b-4b45-bcdd-4549f444fd35/kube-rbac-proxy/0.log" Apr 16 17:16:06.446616 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:06.446589 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-h2crd_0f02da32-cecf-4794-98ac-cfe18237c3e1/prometheus-operator-admission-webhook/0.log" Apr 16 17:16:08.816925 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:08.816895 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-rgfgm_20b249ed-1937-4b5b-b328-8a3db9d456fc/download-server/0.log" Apr 16 17:16:09.224081 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:09.224025 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-msdj2/perf-node-gather-daemonset-gcfs8"] Apr 16 17:16:09.224665 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:09.224640 2572 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7d854cf-e575-4d81-99a8-1b2c5db48239" containerName="gather" Apr 16 17:16:09.224665 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:09.224667 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d854cf-e575-4d81-99a8-1b2c5db48239" containerName="gather" Apr 16 17:16:09.224846 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:09.224690 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7d854cf-e575-4d81-99a8-1b2c5db48239" containerName="copy" Apr 16 17:16:09.224846 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:09.224700 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d854cf-e575-4d81-99a8-1b2c5db48239" containerName="copy" Apr 16 17:16:09.224846 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:09.224802 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7d854cf-e575-4d81-99a8-1b2c5db48239" containerName="copy" Apr 16 17:16:09.224846 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:09.224816 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7d854cf-e575-4d81-99a8-1b2c5db48239" containerName="gather" Apr 16 17:16:09.231352 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:09.231327 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-gcfs8" Apr 16 17:16:09.233333 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:09.233307 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-msdj2/perf-node-gather-daemonset-gcfs8"] Apr 16 17:16:09.350117 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:09.350086 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7fd44d6a-adf4-4c53-a1d6-651e98fead34-proc\") pod \"perf-node-gather-daemonset-gcfs8\" (UID: \"7fd44d6a-adf4-4c53-a1d6-651e98fead34\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-gcfs8" Apr 16 17:16:09.350298 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:09.350171 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4mr4\" (UniqueName: \"kubernetes.io/projected/7fd44d6a-adf4-4c53-a1d6-651e98fead34-kube-api-access-b4mr4\") pod \"perf-node-gather-daemonset-gcfs8\" (UID: \"7fd44d6a-adf4-4c53-a1d6-651e98fead34\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-gcfs8" Apr 16 17:16:09.350298 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:09.350196 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7fd44d6a-adf4-4c53-a1d6-651e98fead34-lib-modules\") pod \"perf-node-gather-daemonset-gcfs8\" (UID: \"7fd44d6a-adf4-4c53-a1d6-651e98fead34\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-gcfs8" Apr 16 17:16:09.350298 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:09.350213 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7fd44d6a-adf4-4c53-a1d6-651e98fead34-sys\") pod \"perf-node-gather-daemonset-gcfs8\" (UID: 
\"7fd44d6a-adf4-4c53-a1d6-651e98fead34\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-gcfs8" Apr 16 17:16:09.350298 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:09.350240 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7fd44d6a-adf4-4c53-a1d6-651e98fead34-podres\") pod \"perf-node-gather-daemonset-gcfs8\" (UID: \"7fd44d6a-adf4-4c53-a1d6-651e98fead34\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-gcfs8" Apr 16 17:16:09.450765 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:09.450726 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b4mr4\" (UniqueName: \"kubernetes.io/projected/7fd44d6a-adf4-4c53-a1d6-651e98fead34-kube-api-access-b4mr4\") pod \"perf-node-gather-daemonset-gcfs8\" (UID: \"7fd44d6a-adf4-4c53-a1d6-651e98fead34\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-gcfs8" Apr 16 17:16:09.450947 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:09.450778 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7fd44d6a-adf4-4c53-a1d6-651e98fead34-lib-modules\") pod \"perf-node-gather-daemonset-gcfs8\" (UID: \"7fd44d6a-adf4-4c53-a1d6-651e98fead34\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-gcfs8" Apr 16 17:16:09.450947 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:09.450823 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7fd44d6a-adf4-4c53-a1d6-651e98fead34-sys\") pod \"perf-node-gather-daemonset-gcfs8\" (UID: \"7fd44d6a-adf4-4c53-a1d6-651e98fead34\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-gcfs8" Apr 16 17:16:09.450947 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:09.450870 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"podres\" (UniqueName: \"kubernetes.io/host-path/7fd44d6a-adf4-4c53-a1d6-651e98fead34-podres\") pod \"perf-node-gather-daemonset-gcfs8\" (UID: \"7fd44d6a-adf4-4c53-a1d6-651e98fead34\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-gcfs8" Apr 16 17:16:09.450947 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:09.450893 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7fd44d6a-adf4-4c53-a1d6-651e98fead34-lib-modules\") pod \"perf-node-gather-daemonset-gcfs8\" (UID: \"7fd44d6a-adf4-4c53-a1d6-651e98fead34\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-gcfs8" Apr 16 17:16:09.450947 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:09.450903 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7fd44d6a-adf4-4c53-a1d6-651e98fead34-proc\") pod \"perf-node-gather-daemonset-gcfs8\" (UID: \"7fd44d6a-adf4-4c53-a1d6-651e98fead34\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-gcfs8" Apr 16 17:16:09.450947 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:09.450923 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7fd44d6a-adf4-4c53-a1d6-651e98fead34-sys\") pod \"perf-node-gather-daemonset-gcfs8\" (UID: \"7fd44d6a-adf4-4c53-a1d6-651e98fead34\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-gcfs8" Apr 16 17:16:09.450947 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:09.450941 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7fd44d6a-adf4-4c53-a1d6-651e98fead34-proc\") pod \"perf-node-gather-daemonset-gcfs8\" (UID: \"7fd44d6a-adf4-4c53-a1d6-651e98fead34\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-gcfs8" Apr 16 17:16:09.451294 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:09.451026 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7fd44d6a-adf4-4c53-a1d6-651e98fead34-podres\") pod \"perf-node-gather-daemonset-gcfs8\" (UID: \"7fd44d6a-adf4-4c53-a1d6-651e98fead34\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-gcfs8" Apr 16 17:16:09.459985 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:09.459952 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4mr4\" (UniqueName: \"kubernetes.io/projected/7fd44d6a-adf4-4c53-a1d6-651e98fead34-kube-api-access-b4mr4\") pod \"perf-node-gather-daemonset-gcfs8\" (UID: \"7fd44d6a-adf4-4c53-a1d6-651e98fead34\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-gcfs8" Apr 16 17:16:09.543186 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:09.543111 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-gcfs8" Apr 16 17:16:09.691501 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:09.691476 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-msdj2/perf-node-gather-daemonset-gcfs8"] Apr 16 17:16:10.038664 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:10.038633 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-k7p77_74b52111-9c5e-4b37-ab68-e34630312fcb/dns/0.log" Apr 16 17:16:10.056726 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:10.056702 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-k7p77_74b52111-9c5e-4b37-ab68-e34630312fcb/kube-rbac-proxy/0.log" Apr 16 17:16:10.158297 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:10.158268 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-p5bf5_b3dfeb2f-a4ab-4fe5-ab2c-6c8dacead269/dns-node-resolver/0.log" Apr 16 17:16:10.601054 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:10.601025 2572 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-5b8c7794b5-ntls8_12ccdd0e-4d6f-4501-bcff-5d8a87aa8417/registry/0.log" Apr 16 17:16:10.638074 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:10.638033 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9tnfz_caaa1d46-b551-4960-b546-994b0ee36fed/node-ca/0.log" Apr 16 17:16:10.657120 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:10.657090 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-gcfs8" event={"ID":"7fd44d6a-adf4-4c53-a1d6-651e98fead34","Type":"ContainerStarted","Data":"eab78d5b277d1aec87434f4edbcf31c3423e4786547d0d1328dc5326362d60e8"} Apr 16 17:16:10.657120 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:10.657124 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-gcfs8" event={"ID":"7fd44d6a-adf4-4c53-a1d6-651e98fead34","Type":"ContainerStarted","Data":"8f97c9fd75f618b8c33f0e15ab7a1491bf37190ff409c87fbff021eaba1bf560"} Apr 16 17:16:10.657298 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:10.657239 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-gcfs8" Apr 16 17:16:10.675130 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:10.675052 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-gcfs8" podStartSLOduration=1.6750360359999998 podStartE2EDuration="1.675036036s" podCreationTimestamp="2026-04-16 17:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:16:10.671637273 +0000 UTC m=+1687.840348633" watchObservedRunningTime="2026-04-16 17:16:10.675036036 +0000 UTC m=+1687.843747400" Apr 16 17:16:11.456807 ip-10-0-137-126 
kubenswrapper[2572]: I0416 17:16:11.456779 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-pm5bn_c11ab2b7-a15d-45b9-95e3-690208f5d272/istio-proxy/0.log" Apr 16 17:16:11.919918 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:11.919882 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-bdzw7_535dfd9c-5e07-4e18-886d-57be1138629f/serve-healthcheck-canary/0.log" Apr 16 17:16:12.391844 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:12.391820 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-p6ljq_18cb8262-52e8-4ba6-87cb-9349969eed30/kube-rbac-proxy/0.log" Apr 16 17:16:12.417989 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:12.417961 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-p6ljq_18cb8262-52e8-4ba6-87cb-9349969eed30/exporter/0.log" Apr 16 17:16:12.435633 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:12.435611 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-p6ljq_18cb8262-52e8-4ba6-87cb-9349969eed30/extractor/0.log" Apr 16 17:16:14.997258 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:14.997211 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-dn95d_148930b9-33c3-45fb-9cce-7707e5682e08/openshift-lws-operator/0.log" Apr 16 17:16:15.495993 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:15.495966 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-6986579898-n5mnl_32c8d391-524f-4885-afb2-f374af1adf47/manager/0.log" Apr 16 17:16:15.556467 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:15.556443 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve_model-serving-api-86f7b4b499-5qnd9_11703ac7-7f53-4289-9ee4-19d11376828a/server/0.log" Apr 16 17:16:15.819826 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:15.819742 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-7cvhc_adbd0b4b-af77-4b37-91f2-e1b99377e319/manager/0.log" Apr 16 17:16:15.838427 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:15.838399 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-9vjtv_7985ba5d-5777-4ab1-9f78-7878d9cbf8f2/s3-init/0.log" Apr 16 17:16:15.865677 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:15.865655 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-d9spr_ecc87e25-944f-4abf-9995-fc8d7e8252ac/seaweedfs/0.log" Apr 16 17:16:16.672147 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:16.672054 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-gcfs8" Apr 16 17:16:20.233833 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:20.233806 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-wqb46_b8070376-8817-4c60-a98b-631847c5de08/migrator/0.log" Apr 16 17:16:20.255538 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:20.255508 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-wqb46_b8070376-8817-4c60-a98b-631847c5de08/graceful-termination/0.log" Apr 16 17:16:21.666544 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:21.666516 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8jhjk_ce4c4ac0-c90c-484b-aa61-731d09fce8d3/kube-multus/0.log" Apr 16 17:16:21.829820 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:21.829767 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-92gx2_74ade214-8512-4cf5-93e8-0ece0e5776f2/kube-multus-additional-cni-plugins/0.log" Apr 16 17:16:21.849157 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:21.849134 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-92gx2_74ade214-8512-4cf5-93e8-0ece0e5776f2/egress-router-binary-copy/0.log" Apr 16 17:16:21.867527 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:21.867507 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-92gx2_74ade214-8512-4cf5-93e8-0ece0e5776f2/cni-plugins/0.log" Apr 16 17:16:21.891561 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:21.891536 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-92gx2_74ade214-8512-4cf5-93e8-0ece0e5776f2/bond-cni-plugin/0.log" Apr 16 17:16:21.910174 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:21.910155 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-92gx2_74ade214-8512-4cf5-93e8-0ece0e5776f2/routeoverride-cni/0.log" Apr 16 17:16:21.930554 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:21.930500 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-92gx2_74ade214-8512-4cf5-93e8-0ece0e5776f2/whereabouts-cni-bincopy/0.log" Apr 16 17:16:21.949857 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:21.949840 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-92gx2_74ade214-8512-4cf5-93e8-0ece0e5776f2/whereabouts-cni/0.log" Apr 16 17:16:22.146649 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:22.146622 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-dv6f9_890f4655-f936-4bb9-b82c-524efb501585/network-metrics-daemon/0.log" Apr 16 
17:16:22.163016 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:22.162995 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-dv6f9_890f4655-f936-4bb9-b82c-524efb501585/kube-rbac-proxy/0.log" Apr 16 17:16:23.259177 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:23.259145 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brhp4_c0c5c0a0-29b2-4743-af7a-0c1150829a60/ovn-controller/0.log" Apr 16 17:16:23.273809 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:23.273777 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brhp4_c0c5c0a0-29b2-4743-af7a-0c1150829a60/ovn-acl-logging/0.log" Apr 16 17:16:23.285890 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:23.285871 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brhp4_c0c5c0a0-29b2-4743-af7a-0c1150829a60/ovn-acl-logging/1.log" Apr 16 17:16:23.302173 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:23.302135 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brhp4_c0c5c0a0-29b2-4743-af7a-0c1150829a60/kube-rbac-proxy-node/0.log" Apr 16 17:16:23.323339 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:23.323319 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brhp4_c0c5c0a0-29b2-4743-af7a-0c1150829a60/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 17:16:23.341754 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:23.341699 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brhp4_c0c5c0a0-29b2-4743-af7a-0c1150829a60/northd/0.log" Apr 16 17:16:23.360470 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:23.360452 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brhp4_c0c5c0a0-29b2-4743-af7a-0c1150829a60/nbdb/0.log" Apr 16 
17:16:23.379958 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:23.379938 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brhp4_c0c5c0a0-29b2-4743-af7a-0c1150829a60/sbdb/0.log" Apr 16 17:16:23.494476 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:23.494438 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brhp4_c0c5c0a0-29b2-4743-af7a-0c1150829a60/ovnkube-controller/0.log" Apr 16 17:16:24.948994 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:24.948967 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-s7xbf_feb3fcea-2282-411d-bb57-2562cc290f0a/network-check-target-container/0.log" Apr 16 17:16:25.919342 ip-10-0-137-126 kubenswrapper[2572]: I0416 17:16:25.919319 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-bpcgz_9f643739-4068-4891-858f-02df7c38bdb7/iptables-alerter/0.log"