Apr 16 16:20:11.925721 ip-10-0-130-165 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 16:20:11.925730 ip-10-0-130-165 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 16:20:11.925737 ip-10-0-130-165 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 16:20:11.925947 ip-10-0-130-165 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 16:20:21.994497 ip-10-0-130-165 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 16:20:21.994517 ip-10-0-130-165 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot c3a3879a04e14fabb575c71c84bef14b --
Apr 16 16:22:50.907830 ip-10-0-130-165 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 16:22:51.370205 ip-10-0-130-165 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:22:51.370205 ip-10-0-130-165 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 16:22:51.370205 ip-10-0-130-165 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:22:51.370205 ip-10-0-130-165 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
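Each journal entry above follows the same shape: a syslog-style timestamp, the hostname, the emitting unit with its PID, then the message. A minimal sketch of splitting a line into those fields, assuming this layout stays stable (the regex and field names here are our own, not part of any tool):

```python
import re

# Hypothetical splitter for the journal lines above: syslog-style timestamp,
# hostname, "unit[pid]:", then the free-form message. Field names are ours.
LINE_RE = re.compile(
    r"^(?P<ts>\w{3} \d{2} [\d:.]+) (?P<host>\S+) (?P<unit>[\w.-]+)\[(?P<pid>\d+)\]: (?P<msg>.*)$"
)

def parse(line):
    """Return the named fields of one journal line, or None if it doesn't match."""
    m = LINE_RE.match(line)
    return m.groupdict() if m else None

rec = parse(
    "Apr 16 16:20:21.994497 ip-10-0-130-165 systemd[1]: "
    "kubelet.service: Failed to schedule restart job: Unit crio.service not found."
)
# rec["unit"] is "systemd", rec["pid"] is "1", rec["msg"] holds the error text
```

Lines that are not entries, such as the `-- Boot … --` separator, simply fail to match and return None.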
Apr 16 16:22:51.370205 ip-10-0-130-165 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:22:51.373434 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.373323 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 16:22:51.379686 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379651 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:22:51.379686 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379679 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:22:51.379686 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379684 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:22:51.379686 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379688 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:22:51.379686 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379692 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:22:51.379686 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379695 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:22:51.379686 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379698 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:22:51.379686 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379701 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:22:51.379998 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379704 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:22:51.379998 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379709 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:22:51.379998 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379713 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:22:51.379998 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379716 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:22:51.379998 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379719 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:22:51.379998 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379722 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:22:51.379998 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379725 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:22:51.379998 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379728 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:22:51.379998 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379731 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:22:51.379998 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379734 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:22:51.379998 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379737 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:22:51.379998 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379740 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:22:51.379998 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379743 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:22:51.379998 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379745 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:22:51.379998 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379748 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:22:51.379998 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379750 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:22:51.379998 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379754 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:22:51.379998 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379757 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:22:51.379998 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379759 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:22:51.380595 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379763 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:22:51.380595 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379765 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:22:51.380595 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379768 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:22:51.380595 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379770 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:22:51.380595 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379773 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:22:51.380595 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379776 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:22:51.380595 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379781 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:22:51.380595 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379784 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:22:51.380595 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379786 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:22:51.380595 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379789 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:22:51.380595 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379792 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:22:51.380595 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379794 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:22:51.380595 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379797 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:22:51.380595 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379800 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:22:51.380595 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379803 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:22:51.380595 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379806 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:22:51.380595 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379809 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:22:51.380595 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379812 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:22:51.380595 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379814 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:22:51.381114 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379817 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:22:51.381114 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379821 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:22:51.381114 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379824 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:22:51.381114 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379826 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:22:51.381114 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379829 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:22:51.381114 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379831 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:22:51.381114 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379834 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:22:51.381114 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379837 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:22:51.381114 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379840 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:22:51.381114 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379842 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:22:51.381114 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379845 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:22:51.381114 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379847 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:22:51.381114 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379850 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:22:51.381114 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379855 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:22:51.381114 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379859 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:22:51.381114 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379862 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:22:51.381114 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379865 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:22:51.381114 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379868 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:22:51.381114 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379871 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:22:51.381114 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379873 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:22:51.381628 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379876 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:22:51.381628 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379879 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:22:51.381628 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379881 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:22:51.381628 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379884 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:22:51.381628 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379886 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:22:51.381628 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379889 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:22:51.381628 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379893 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:22:51.381628 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379896 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:22:51.381628 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379899 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:22:51.381628 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379901 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:22:51.381628 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379904 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:22:51.381628 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379907 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:22:51.381628 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379911 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:22:51.381628 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379914 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:22:51.381628 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379917 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:22:51.381628 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379920 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:22:51.381628 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379923 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:22:51.381628 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379926 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:22:51.381628 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379928 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:22:51.381628 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.379931 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:22:51.382106 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380370 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:22:51.382106 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380375 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:22:51.382106 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380379 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:22:51.382106 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380382 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:22:51.382106 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380385 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:22:51.382106 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380387 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:22:51.382106 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380390 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:22:51.382106 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380393 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:22:51.382106 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380395 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:22:51.382106 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380398 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:22:51.382106 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380401 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:22:51.382106 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380404 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:22:51.382106 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380406 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:22:51.382106 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380409 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:22:51.382106 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380412 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:22:51.382106 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380414 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:22:51.382106 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380417 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:22:51.382106 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380419 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:22:51.382106 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380422 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:22:51.382106 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380426 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:22:51.382597 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380429 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:22:51.382597 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380431 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:22:51.382597 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380434 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:22:51.382597 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380437 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:22:51.382597 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380459 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:22:51.382597 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380463 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:22:51.382597 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380466 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:22:51.382597 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380469 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:22:51.382597 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380471 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:22:51.382597 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380474 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:22:51.382597 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380477 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:22:51.382597 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380480 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:22:51.382597 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380483 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:22:51.382597 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380485 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:22:51.382597 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380488 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:22:51.382597 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380491 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:22:51.382597 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380493 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:22:51.382597 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380496 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:22:51.382597 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380499 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:22:51.382597 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380501 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:22:51.383093 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380504 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:22:51.383093 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380507 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:22:51.383093 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380511 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:22:51.383093 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380514 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:22:51.383093 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380516 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:22:51.383093 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380519 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:22:51.383093 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380521 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:22:51.383093 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380525 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:22:51.383093 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380528 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:22:51.383093 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380531 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:22:51.383093 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380533 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:22:51.383093 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380537 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:22:51.383093 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380540 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:22:51.383093 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380543 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:22:51.383093 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380546 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:22:51.383093 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380548 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:22:51.383093 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380558 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:22:51.383093 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380563 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:22:51.383093 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380567 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:22:51.383093 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380569 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:22:51.383596 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380575 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:22:51.383596 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380579 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:22:51.383596 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380582 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:22:51.383596 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380585 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:22:51.383596 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380588 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:22:51.383596 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380591 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:22:51.383596 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380594 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:22:51.383596 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380597 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:22:51.383596 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380600 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:22:51.383596 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380603 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:22:51.383596 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380606 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:22:51.383596 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380608 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:22:51.383596 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380611 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:22:51.383596 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380614 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:22:51.383596 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380617 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:22:51.383596 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380620 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:22:51.383596 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380622 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:22:51.383596 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380625 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:22:51.383596 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380627 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:22:51.383596 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380630 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:22:51.384088 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380633 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:22:51.384088 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380636 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:22:51.384088 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380638 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:22:51.384088 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380643 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:22:51.384088 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380646 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:22:51.384088 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.380649 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:22:51.384088 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382305 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 16:22:51.384088 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382319 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 16:22:51.384088 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382326 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 16:22:51.384088 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382331 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 16:22:51.384088 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382341 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 16:22:51.384088 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382344 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 16:22:51.384088 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382349 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 16:22:51.384088 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382355 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 16:22:51.384088 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382358 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 16:22:51.384088 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382361 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 16:22:51.384088 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382365 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 16:22:51.384088 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382369 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 16:22:51.384088 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382372 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 16:22:51.384088 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382375 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 16 16:22:51.384088 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382378 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 16:22:51.384088 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382381 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 16 16:22:51.384088 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382384 2577 flags.go:64] FLAG: --cloud-config=""
Apr 16 16:22:51.384672 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382387 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 16:22:51.384672 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382390 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 16:22:51.384672 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382395 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 16 16:22:51.384672 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382398 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 16:22:51.384672 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382402 2577 flags.go:64] FLAG: --config-dir=""
Apr 16 16:22:51.384672 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382405 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 16:22:51.384672 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382408 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 16:22:51.384672 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382413 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 16:22:51.384672 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382416 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 16:22:51.384672 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382419 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 16:22:51.384672 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382423 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 16:22:51.384672 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382426 2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 16:22:51.384672 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382430 2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 16:22:51.384672 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382433 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 16:22:51.384672 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382436 2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 16:22:51.384672 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382439 2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 16:22:51.384672 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382456 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 16:22:51.384672 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382459 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 16:22:51.384672 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382462 2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 16:22:51.384672 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382465 2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 16:22:51.384672 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382469 2577 flags.go:64] FLAG: --enable-server="true"
Apr 16 16:22:51.384672 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.382472 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 16:22:51.384672 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383290 2577 flags.go:64] FLAG: --event-burst="100"
Apr 16 16:22:51.384672 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383294 2577 flags.go:64] FLAG: --event-qps="50"
Apr 16 16:22:51.384672 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383297 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 16:22:51.385303 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383301 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 16:22:51.385303 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383305 2577 flags.go:64] FLAG: --eviction-hard=""
Apr 16 16:22:51.385303 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383309 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 16:22:51.385303 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383313 2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 16:22:51.385303 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383316 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 16:22:51.385303 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383319 2577 flags.go:64] FLAG: --eviction-soft=""
Apr 16 16:22:51.385303 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383322 2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 16:22:51.385303 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383325 2577 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 16:22:51.385303 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383328 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 16:22:51.385303 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383331 2577 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 16:22:51.385303 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383334 2577 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 16:22:51.385303 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383338 2577 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 16:22:51.385303 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383341 2577 flags.go:64] FLAG: --feature-gates=""
Apr 16 16:22:51.385303 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383345 2577 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 16:22:51.385303 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383348 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 16:22:51.385303 ip-10-0-130-165
kubenswrapper[2577]: I0416 16:22:51.383351 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 16:22:51.385303 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383355 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 16:22:51.385303 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383359 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 16 16:22:51.385303 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383362 2577 flags.go:64] FLAG: --help="false" Apr 16 16:22:51.385303 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383365 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-130-165.ec2.internal" Apr 16 16:22:51.385303 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383368 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 16:22:51.385303 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383371 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 16:22:51.385303 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383375 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 16:22:51.385885 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383378 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 16:22:51.385885 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383382 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 16:22:51.385885 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383385 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 16:22:51.385885 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383388 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 16:22:51.385885 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383391 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 16:22:51.385885 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383395 2577 flags.go:64] FLAG: 
--kube-api-burst="100" Apr 16 16:22:51.385885 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383398 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 16:22:51.385885 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383401 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 16:22:51.385885 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383404 2577 flags.go:64] FLAG: --kube-reserved="" Apr 16 16:22:51.385885 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383408 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 16:22:51.385885 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383411 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 16:22:51.385885 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383414 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 16:22:51.385885 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383417 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 16:22:51.385885 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383420 2577 flags.go:64] FLAG: --lock-file="" Apr 16 16:22:51.385885 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383423 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 16:22:51.385885 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383426 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 16:22:51.385885 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383429 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 16:22:51.385885 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383436 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 16:22:51.385885 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383438 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 16:22:51.385885 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383454 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 16:22:51.385885 ip-10-0-130-165 kubenswrapper[2577]: 
I0416 16:22:51.383458 2577 flags.go:64] FLAG: --logging-format="text" Apr 16 16:22:51.385885 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383461 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 16:22:51.385885 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383465 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 16:22:51.385885 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383468 2577 flags.go:64] FLAG: --manifest-url="" Apr 16 16:22:51.386526 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383471 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 16 16:22:51.386526 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383478 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 16:22:51.386526 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383482 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 16:22:51.386526 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383486 2577 flags.go:64] FLAG: --max-pods="110" Apr 16 16:22:51.386526 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383490 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 16:22:51.386526 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383493 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 16:22:51.386526 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383496 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 16:22:51.386526 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383499 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 16:22:51.386526 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383502 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 16:22:51.386526 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383505 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 16:22:51.386526 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383509 2577 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 16:22:51.386526 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383519 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 16:22:51.386526 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383522 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 16:22:51.386526 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383525 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 16:22:51.386526 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383529 2577 flags.go:64] FLAG: --pod-cidr="" Apr 16 16:22:51.386526 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383532 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 16:22:51.386526 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383539 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 16:22:51.386526 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383542 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 16:22:51.386526 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383545 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 16 16:22:51.386526 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383548 2577 flags.go:64] FLAG: --port="10250" Apr 16 16:22:51.386526 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383551 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 16:22:51.386526 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383554 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0b7a1cbc21d7ff106" Apr 16 16:22:51.386526 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383557 2577 flags.go:64] FLAG: --qos-reserved="" Apr 16 16:22:51.386526 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383561 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 16 16:22:51.387117 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383563 
2577 flags.go:64] FLAG: --register-node="true" Apr 16 16:22:51.387117 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383567 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 16 16:22:51.387117 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383570 2577 flags.go:64] FLAG: --register-with-taints="" Apr 16 16:22:51.387117 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383575 2577 flags.go:64] FLAG: --registry-burst="10" Apr 16 16:22:51.387117 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383578 2577 flags.go:64] FLAG: --registry-qps="5" Apr 16 16:22:51.387117 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383581 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 16 16:22:51.387117 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383584 2577 flags.go:64] FLAG: --reserved-memory="" Apr 16 16:22:51.387117 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383588 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 16:22:51.387117 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383592 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 16:22:51.387117 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383595 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 16:22:51.387117 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383598 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 16:22:51.387117 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383601 2577 flags.go:64] FLAG: --runonce="false" Apr 16 16:22:51.387117 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383605 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 16:22:51.387117 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383608 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 16:22:51.387117 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383611 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 16 16:22:51.387117 ip-10-0-130-165 kubenswrapper[2577]: I0416 
16:22:51.383614 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 16:22:51.387117 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383617 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 16:22:51.387117 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383621 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 16:22:51.387117 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383624 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 16:22:51.387117 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383627 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 16:22:51.387117 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383631 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 16:22:51.387117 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383634 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 16:22:51.387117 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383636 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 16:22:51.387117 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383640 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 16:22:51.387117 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383643 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 16:22:51.387117 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383646 2577 flags.go:64] FLAG: --system-cgroups="" Apr 16 16:22:51.387762 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383649 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 16:22:51.387762 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383655 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 16:22:51.387762 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383658 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 16 16:22:51.387762 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383661 2577 flags.go:64] FLAG: 
--tls-cipher-suites="[]" Apr 16 16:22:51.387762 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383666 2577 flags.go:64] FLAG: --tls-min-version="" Apr 16 16:22:51.387762 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383669 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 16:22:51.387762 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383672 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 16:22:51.387762 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383675 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 16:22:51.387762 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383678 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 16:22:51.387762 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383681 2577 flags.go:64] FLAG: --v="2" Apr 16 16:22:51.387762 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383686 2577 flags.go:64] FLAG: --version="false" Apr 16 16:22:51.387762 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383690 2577 flags.go:64] FLAG: --vmodule="" Apr 16 16:22:51.387762 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383695 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 16:22:51.387762 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.383698 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 16:22:51.387762 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383808 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:22:51.387762 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383811 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:22:51.387762 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383821 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:22:51.387762 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383824 2577 feature_gate.go:328] unrecognized feature gate: 
BootcNodeManagement Apr 16 16:22:51.387762 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383829 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:22:51.387762 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383831 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:22:51.387762 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383834 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:22:51.387762 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383837 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:22:51.387762 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383840 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:22:51.388353 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383842 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:22:51.388353 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383845 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:22:51.388353 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383848 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:22:51.388353 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383851 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:22:51.388353 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383854 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:22:51.388353 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383858 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:22:51.388353 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383860 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 
16:22:51.388353 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383865 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 16:22:51.388353 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383869 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:22:51.388353 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383872 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:22:51.388353 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383874 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:22:51.388353 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383877 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:22:51.388353 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383879 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:22:51.388353 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383882 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:22:51.388353 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383884 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:22:51.388353 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383887 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:22:51.388353 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383890 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:22:51.388353 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383892 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:22:51.388353 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383895 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:22:51.388878 ip-10-0-130-165 kubenswrapper[2577]: W0416 
16:22:51.383897 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:22:51.388878 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383900 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:22:51.388878 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383903 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:22:51.388878 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383905 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:22:51.388878 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383908 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:22:51.388878 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383911 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:22:51.388878 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383915 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:22:51.388878 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383917 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:22:51.388878 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383920 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:22:51.388878 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383923 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:22:51.388878 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383925 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 16:22:51.388878 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383928 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:22:51.388878 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383930 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:22:51.388878 
ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383933 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:22:51.388878 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383936 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:22:51.388878 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383938 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:22:51.388878 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383941 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:22:51.388878 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383944 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:22:51.388878 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383946 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:22:51.388878 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383949 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:22:51.389369 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383951 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:22:51.389369 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383955 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:22:51.389369 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383957 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:22:51.389369 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383960 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:22:51.389369 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383963 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:22:51.389369 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383965 2577 feature_gate.go:328] unrecognized feature gate: 
AzureClusterHostedDNSInstall Apr 16 16:22:51.389369 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383968 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:22:51.389369 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383970 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:22:51.389369 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383973 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:22:51.389369 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383979 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:22:51.389369 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383981 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 16:22:51.389369 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383984 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:22:51.389369 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383986 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:22:51.389369 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383989 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:22:51.389369 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383991 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:22:51.389369 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383994 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:22:51.389369 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383997 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:22:51.389369 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.383999 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 16:22:51.389369 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.384003 2577 
feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:22:51.389369 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.384006 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:22:51.389891 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.384009 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:22:51.389891 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.384011 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:22:51.389891 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.384014 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:22:51.389891 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.384017 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:22:51.389891 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.384019 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:22:51.389891 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.384022 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:22:51.389891 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.384026 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 16:22:51.389891 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.384030 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 16:22:51.389891 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.384033 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 16:22:51.389891 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.384037 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:22:51.389891 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.384039 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:22:51.389891 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.384042 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:22:51.389891 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.384045 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:22:51.389891 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.384048 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:22:51.389891 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.384053 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:22:51.389891 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.384056 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:22:51.389891 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.384059 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:22:51.389891 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.384061 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:22:51.390342 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.384860 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false 
MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 16:22:51.391931 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.391905 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 16:22:51.391931 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.391932 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 16:22:51.392010 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.391987 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:22:51.392010 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.391992 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 16:22:51.392010 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.391996 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:22:51.392010 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.391999 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:22:51.392010 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392002 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:22:51.392010 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392005 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:22:51.392010 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392008 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 16:22:51.392010 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392011 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 16:22:51.392010 ip-10-0-130-165 kubenswrapper[2577]: W0416 
16:22:51.392014 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:22:51.392237 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392017 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:22:51.392237 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392021 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 16:22:51.392237 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392025 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:22:51.392237 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392028 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:22:51.392237 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392031 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:22:51.392237 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392035 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 16:22:51.392237 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392040 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:22:51.392237 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392043 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:22:51.392237 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392046 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:22:51.392237 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392048 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:22:51.392237 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392051 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:22:51.392237 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392054 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:22:51.392237 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392057 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:22:51.392237 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392060 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:22:51.392237 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392062 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:22:51.392237 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392065 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:22:51.392237 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392068 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:22:51.392237 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392071 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:22:51.392237 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392073 2577 feature_gate.go:328] 
unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:22:51.392719 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392076 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:22:51.392719 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392078 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:22:51.392719 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392081 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:22:51.392719 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392085 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:22:51.392719 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392087 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:22:51.392719 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392090 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:22:51.392719 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392092 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:22:51.392719 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392095 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 16:22:51.392719 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392097 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:22:51.392719 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392100 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:22:51.392719 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392102 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:22:51.392719 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392105 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:22:51.392719 ip-10-0-130-165 kubenswrapper[2577]: W0416 
16:22:51.392107 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:22:51.392719 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392110 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:22:51.392719 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392114 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:22:51.392719 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392117 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:22:51.392719 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392119 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:22:51.392719 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392122 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:22:51.392719 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392125 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:22:51.392719 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392127 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:22:51.393243 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392130 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:22:51.393243 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392133 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:22:51.393243 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392136 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:22:51.393243 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392138 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:22:51.393243 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392141 2577 feature_gate.go:328] unrecognized feature gate: 
IngressControllerDynamicConfigurationManager Apr 16 16:22:51.393243 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392144 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:22:51.393243 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392146 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:22:51.393243 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392149 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:22:51.393243 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392152 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:22:51.393243 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392154 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:22:51.393243 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392157 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:22:51.393243 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392159 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:22:51.393243 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392162 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:22:51.393243 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392164 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:22:51.393243 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392167 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:22:51.393243 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392170 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:22:51.393243 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392173 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:22:51.393243 ip-10-0-130-165 kubenswrapper[2577]: W0416 
16:22:51.392176 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:22:51.393243 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392178 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:22:51.393769 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392181 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:22:51.393769 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392183 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:22:51.393769 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392186 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:22:51.393769 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392189 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:22:51.393769 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392191 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:22:51.393769 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392195 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 16:22:51.393769 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392198 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:22:51.393769 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392201 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:22:51.393769 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392204 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:22:51.393769 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392206 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:22:51.393769 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392209 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:22:51.393769 
ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392211 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:22:51.393769 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392214 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 16:22:51.393769 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392217 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:22:51.393769 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392220 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:22:51.393769 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392222 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:22:51.393769 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392225 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:22:51.393769 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392227 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:22:51.393769 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392230 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:22:51.394238 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.392235 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 16:22:51.394238 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392354 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration 
Apr 16 16:22:51.394238 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392360 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:22:51.394238 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392363 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:22:51.394238 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392366 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:22:51.394238 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392369 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:22:51.394238 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392372 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:22:51.394238 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392375 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:22:51.394238 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392378 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:22:51.394238 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392381 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 16:22:51.394238 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392383 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:22:51.394238 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392386 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:22:51.394238 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392389 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:22:51.394238 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392391 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:22:51.394238 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392394 2577 feature_gate.go:328] unrecognized 
feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:22:51.394238 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392397 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:22:51.394652 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392400 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 16:22:51.394652 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392405 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:22:51.394652 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392407 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:22:51.394652 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392410 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:22:51.394652 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392413 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:22:51.394652 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392416 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:22:51.394652 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392418 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:22:51.394652 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392421 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:22:51.394652 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392423 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:22:51.394652 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392426 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:22:51.394652 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392428 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:22:51.394652 ip-10-0-130-165 
kubenswrapper[2577]: W0416 16:22:51.392431 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:22:51.394652 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392434 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:22:51.394652 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392436 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:22:51.394652 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392439 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:22:51.394652 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392458 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:22:51.394652 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392461 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:22:51.394652 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392464 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 16:22:51.394652 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392466 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:22:51.394652 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392469 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:22:51.395145 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392472 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:22:51.395145 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392475 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:22:51.395145 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392477 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:22:51.395145 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392481 2577 feature_gate.go:328] unrecognized feature gate: 
BuildCSIVolumes Apr 16 16:22:51.395145 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392484 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:22:51.395145 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392486 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:22:51.395145 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392489 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:22:51.395145 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392492 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:22:51.395145 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392496 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:22:51.395145 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392498 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:22:51.395145 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392501 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:22:51.395145 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392504 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:22:51.395145 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392506 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:22:51.395145 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392509 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:22:51.395145 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392511 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:22:51.395145 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392514 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:22:51.395145 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392517 2577 feature_gate.go:328] 
unrecognized feature gate: DNSNameResolver Apr 16 16:22:51.395145 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392520 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 16:22:51.395145 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392523 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:22:51.395145 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392525 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:22:51.395675 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392529 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 16:22:51.395675 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392533 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:22:51.395675 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392536 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:22:51.395675 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392539 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:22:51.395675 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392542 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:22:51.395675 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392544 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:22:51.395675 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392547 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 16:22:51.395675 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392556 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:22:51.395675 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392561 2577 
feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:22:51.395675 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392564 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:22:51.395675 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392567 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:22:51.395675 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392569 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:22:51.395675 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392572 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:22:51.395675 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392575 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:22:51.395675 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392577 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:22:51.395675 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392581 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:22:51.395675 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392584 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:22:51.395675 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392586 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 16:22:51.395675 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392589 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:22:51.396133 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392592 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 16:22:51.396133 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392595 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 
16:22:51.396133 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392599 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:22:51.396133 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392601 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:22:51.396133 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392604 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:22:51.396133 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392607 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:22:51.396133 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392610 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:22:51.396133 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392612 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:22:51.396133 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392614 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:22:51.396133 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392617 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:22:51.396133 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392619 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:22:51.396133 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:51.392622 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:22:51.396133 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.392628 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true 
StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 16:22:51.396133 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.393463 2577 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 16:22:51.398818 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.398799 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 16:22:51.399845 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.399829 2577 server.go:1019] "Starting client certificate rotation"
Apr 16 16:22:51.399958 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.399934 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 16:22:51.399990 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.399983 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 16:22:51.426998 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.426970 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 16:22:51.434525 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.434488 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 16:22:51.451264 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.451233 2577 log.go:25] "Validated CRI v1 runtime API"
Apr 16 16:22:51.457184 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.457160 2577 log.go:25] "Validated CRI v1 image API"
Apr 16 16:22:51.459586 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.459563 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 16:22:51.462566 ip-10-0-130-165 kubenswrapper[2577]: I0416
16:22:51.462542 2577 fs.go:135] Filesystem UUIDs: map[61860168-ced7-4f8a-9205-772296b3e242:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 afc6c2e5-1336-4110-946a-dca14d00a1d5:/dev/nvme0n1p3]
Apr 16 16:22:51.462647 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.462565 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 16:22:51.464874 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.464852 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 16:22:51.468865 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.468731 2577 manager.go:217] Machine: {Timestamp:2026-04-16 16:22:51.466520871 +0000 UTC m=+0.431301359 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3096570 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2c31569f34ab13156849ee15f9bdf9 SystemUUID:ec2c3156-9f34-ab13-1568-49ee15f9bdf9 BootID:c3a3879a-04e1-4fab-b575-c71c84bef14b Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3
DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:85:5c:8b:64:af Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:85:5c:8b:64:af Speed:0 Mtu:9001} {Name:ovs-system MacAddress:b6:1d:90:82:1e:d8 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 16:22:51.468865 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.468863 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 16 16:22:51.468982 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.468969 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 16:22:51.471668 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.471626 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 16:22:51.471836 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.471671 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-165.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 16:22:51.471889 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.471847 2577 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 16:22:51.471889 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.471859 2577 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 16:22:51.471889 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.471873 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 16:22:51.471889 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.471888 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 16:22:51.473235 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.473218 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 16:22:51.473382 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.473372 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 16:22:51.476031 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.476016 2577 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 16:22:51.476087 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.476046 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 16:22:51.476087 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.476060 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 16:22:51.476087 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.476072 2577 kubelet.go:397] "Adding apiserver pod source"
Apr 16 16:22:51.476087 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.476081 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 16:22:51.477245 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.477231 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 16:22:51.477286 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.477253 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 16:22:51.480824 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.480802 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 16:22:51.482769 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.482754 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 16:22:51.484266 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.484254 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 16:22:51.484302 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.484271 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 16:22:51.484302 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.484278 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 16:22:51.484302 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.484283 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 16:22:51.484302 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.484290 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 16:22:51.484302 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.484302 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 16:22:51.484435 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.484310 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 16:22:51.484435 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.484315 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 16:22:51.484435 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.484322 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 16:22:51.484435 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.484328 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 16:22:51.484435 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.484342 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 16:22:51.484435 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.484351 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 16:22:51.485255 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.485245 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 16:22:51.485255 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.485256 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 16:22:51.487285 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.487266 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-xr4zw"
Apr 16 16:22:51.489137 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.489120 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 16:22:51.489217 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.489160 2577 server.go:1295] "Started kubelet"
Apr 16 16:22:51.489332 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.489303 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 16:22:51.489381 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.489293 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 16:22:51.489425 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.489392 2577 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 16:22:51.490254 ip-10-0-130-165 systemd[1]: Started Kubernetes Kubelet.
Apr 16 16:22:51.494242 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.494213 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 16:22:51.494953 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.494932 2577 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 16:22:51.495382 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.495359 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-xr4zw"
Apr 16 16:22:51.496800 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:51.496770 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-165.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 16:22:51.496993 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:51.496963 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 16:22:51.497088 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.497057 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-165.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 16:22:51.501409 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.501377 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 16:22:51.501768 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:51.501742 2577 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 16:22:51.501863 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.501827 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 16:22:51.502688 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.502500 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 16:22:51.502777 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.502694 2577 factory.go:55] Registering systemd factory
Apr 16 16:22:51.502777 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.502708 2577 factory.go:223] Registration of the systemd container factory successfully
Apr 16 16:22:51.502867 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:51.502818 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-165.ec2.internal\" not found"
Apr 16 16:22:51.503688 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.503663 2577 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 16:22:51.503688 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.503687 2577 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 16:22:51.503841 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.503828 2577 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 16:22:51.503841 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.503841 2577 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 16:22:51.503995 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.503965 2577 factory.go:153] Registering CRI-O factory
Apr 16 16:22:51.503995 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.503979 2577 factory.go:223] Registration of the crio container factory successfully
Apr 16 16:22:51.504062 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.504046 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 16:22:51.504193 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.504066 2577 factory.go:103] Registering Raw factory
Apr 16 16:22:51.504193 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.504083 2577 manager.go:1196] Started watching for new ooms in manager
Apr 16 16:22:51.504636 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.504596 2577 manager.go:319] Starting recovery of all containers
Apr 16 16:22:51.511992 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.511840 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:22:51.515093 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:51.515063 2577 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-130-165.ec2.internal\" not found" node="ip-10-0-130-165.ec2.internal"
Apr 16 16:22:51.516789 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.516770 2577 manager.go:324] Recovery completed
Apr 16 16:22:51.521258 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.521240 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:22:51.524814 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.524792 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-165.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:22:51.524936 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.524827 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-165.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:22:51.524936 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.524842 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-165.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:22:51.525807 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.525785 2577 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 16:22:51.525807 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.525806 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 16:22:51.525949 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.525831 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 16:22:51.528627 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.528606 2577 policy_none.go:49] "None policy: Start"
Apr 16 16:22:51.528627 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.528630 2577 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 16:22:51.528762 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.528641 2577 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 16:22:51.567162 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.567129 2577 manager.go:341] "Starting Device Plugin manager"
Apr 16 16:22:51.567352 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:51.567176 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 16:22:51.567352 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.567189 2577 server.go:85] "Starting device plugin registration server"
Apr 16 16:22:51.567740 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.567513 2577 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 16:22:51.567740 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.567527 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 16:22:51.567740 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.567619 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 16:22:51.567740 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.567705 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 16:22:51.567740 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.567713 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 16:22:51.586352 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:51.568327 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 16:22:51.586352 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:51.568370 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-165.ec2.internal\" not found"
Apr 16 16:22:51.644674 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.644575 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 16:22:51.645924 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.645905 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 16:22:51.646038 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.645940 2577 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 16:22:51.646038 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.645965 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 16:22:51.646038 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.645975 2577 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 16:22:51.646038 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:51.646021 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 16:22:51.650418 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.650397 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:22:51.668507 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.668484 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:22:51.669770 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.669753 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-165.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:22:51.669849 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.669785 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-165.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:22:51.669849 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.669796 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-165.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:22:51.669849 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.669827 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-165.ec2.internal"
Apr 16 16:22:51.676394 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.676374 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-165.ec2.internal"
Apr 16 16:22:51.676486 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:51.676402 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-165.ec2.internal\": node \"ip-10-0-130-165.ec2.internal\" not found"
Apr 16 16:22:51.694515 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:51.694483 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-165.ec2.internal\" not found"
Apr 16 16:22:51.746380 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.746314 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-165.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-165.ec2.internal"]
Apr 16 16:22:51.746519 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.746469 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:22:51.747536 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.747517 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-165.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:22:51.747638 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.747554 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-165.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:22:51.747638 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.747572 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-165.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:22:51.748887 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.748869 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:22:51.749034 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.749019 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-165.ec2.internal"
Apr 16 16:22:51.749085 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.749054 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:22:51.749718 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.749694 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-165.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:22:51.749809 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.749727 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-165.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:22:51.749809 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.749737 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-165.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:22:51.749809 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.749694 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-165.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:22:51.749809 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.749806 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-165.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:22:51.749928 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.749818 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-165.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:22:51.751395 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.751380 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-165.ec2.internal"
Apr 16 16:22:51.751477 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.751405 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:22:51.752161 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.752146 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-165.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:22:51.752220 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.752173 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-165.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:22:51.752220 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.752182 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-165.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:22:51.780395 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:51.780368 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-165.ec2.internal\" not found" node="ip-10-0-130-165.ec2.internal"
Apr 16 16:22:51.784912 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:51.784894 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-165.ec2.internal\" not found" node="ip-10-0-130-165.ec2.internal"
Apr 16 16:22:51.794605 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:51.794580 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-165.ec2.internal\" not found"
Apr 16 16:22:51.804828 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.804798 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cfff45f070cd3f24f31d63385bd46a42-config\") pod \"kube-apiserver-proxy-ip-10-0-130-165.ec2.internal\" (UID: \"cfff45f070cd3f24f31d63385bd46a42\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-165.ec2.internal"
Apr 16 16:22:51.804917 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.804832 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bccc9b33d7859fe9c2c31fd9465d1b33-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-165.ec2.internal\" (UID: \"bccc9b33d7859fe9c2c31fd9465d1b33\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-165.ec2.internal"
Apr 16 16:22:51.804917 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.804857 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bccc9b33d7859fe9c2c31fd9465d1b33-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-165.ec2.internal\" (UID: \"bccc9b33d7859fe9c2c31fd9465d1b33\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-165.ec2.internal"
Apr 16 16:22:51.894932 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:51.894862 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-165.ec2.internal\" not found"
Apr 16 16:22:51.905221 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.905195 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cfff45f070cd3f24f31d63385bd46a42-config\") pod \"kube-apiserver-proxy-ip-10-0-130-165.ec2.internal\" (UID: \"cfff45f070cd3f24f31d63385bd46a42\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-165.ec2.internal"
Apr 16 16:22:51.905335 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.905218 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cfff45f070cd3f24f31d63385bd46a42-config\") pod \"kube-apiserver-proxy-ip-10-0-130-165.ec2.internal\" (UID: \"cfff45f070cd3f24f31d63385bd46a42\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-165.ec2.internal"
Apr 16 16:22:51.905335 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.905241 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bccc9b33d7859fe9c2c31fd9465d1b33-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-165.ec2.internal\" (UID: \"bccc9b33d7859fe9c2c31fd9465d1b33\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-165.ec2.internal"
Apr 16 16:22:51.905335 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.905273 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bccc9b33d7859fe9c2c31fd9465d1b33-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-165.ec2.internal\" (UID: \"bccc9b33d7859fe9c2c31fd9465d1b33\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-165.ec2.internal"
Apr 16 16:22:51.905335 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.905286 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bccc9b33d7859fe9c2c31fd9465d1b33-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-165.ec2.internal\" (UID: \"bccc9b33d7859fe9c2c31fd9465d1b33\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-165.ec2.internal"
Apr 16 16:22:51.905335 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:51.905310 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bccc9b33d7859fe9c2c31fd9465d1b33-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-165.ec2.internal\" (UID: \"bccc9b33d7859fe9c2c31fd9465d1b33\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-165.ec2.internal"
Apr 16 16:22:51.995670 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:51.995633 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-165.ec2.internal\" not found"
Apr 16 16:22:52.084162 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:52.084126 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-165.ec2.internal"
Apr 16 16:22:52.087731 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:52.087713 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-165.ec2.internal"
Apr 16 16:22:52.096752 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:52.096733 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-165.ec2.internal\" not found"
Apr 16 16:22:52.197317 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:52.197222 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-165.ec2.internal\" not found"
Apr 16 16:22:52.297869 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:52.297841 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-165.ec2.internal\" not found"
Apr 16 16:22:52.398511 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:52.398473 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-165.ec2.internal\" not found"
Apr 16 16:22:52.399573 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:52.399553 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 16:22:52.399711 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:52.399697 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 16:22:52.399748 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:52.399724 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 16:22:52.497516 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:52.497409 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 16:17:51 +0000 UTC" deadline="2027-10-07 20:39:46.416219103 +0000 UTC"
Apr 16 16:22:52.497516 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:52.497464 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12940h16m53.918759749s"
Apr 16 16:22:52.499173 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:52.499150 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-165.ec2.internal\" not found"
Apr 16 16:22:52.502112 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:52.502046 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 16:22:52.517261 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:52.517231 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 16:22:52.543241 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:52.543208 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-qkx59"
Apr 16 16:22:52.553577 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:52.553550 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-qkx59"
Apr 16 16:22:52.600103 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:52.600073 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-165.ec2.internal\" not found"
Apr 16 16:22:52.602991 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:52.602951 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfff45f070cd3f24f31d63385bd46a42.slice/crio-9c5df2e94e2cc932ca278bee202153e07899b34a3745c0896ad0a98b738c05e1 WatchSource:0}: Error finding container 9c5df2e94e2cc932ca278bee202153e07899b34a3745c0896ad0a98b738c05e1: Status 404 returned error can't find the container with id 9c5df2e94e2cc932ca278bee202153e07899b34a3745c0896ad0a98b738c05e1
Apr 16 16:22:52.603162 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:52.603143 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbccc9b33d7859fe9c2c31fd9465d1b33.slice/crio-ac4b3e4679523f019c0cf8092bd0238128866fa083ca5a7f6beb794fb03e6762 WatchSource:0}: Error finding container ac4b3e4679523f019c0cf8092bd0238128866fa083ca5a7f6beb794fb03e6762: Status 404 returned error can't find the container with id ac4b3e4679523f019c0cf8092bd0238128866fa083ca5a7f6beb794fb03e6762
Apr 16 16:22:52.607813 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:52.607799 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 16:22:52.636081 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:52.636047 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:22:52.649106 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:52.649060 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-165.ec2.internal" event={"ID":"bccc9b33d7859fe9c2c31fd9465d1b33","Type":"ContainerStarted","Data":"ac4b3e4679523f019c0cf8092bd0238128866fa083ca5a7f6beb794fb03e6762"}
Apr 16 16:22:52.649901 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:52.649883 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-165.ec2.internal" event={"ID":"cfff45f070cd3f24f31d63385bd46a42","Type":"ContainerStarted","Data":"9c5df2e94e2cc932ca278bee202153e07899b34a3745c0896ad0a98b738c05e1"}
Apr 16 16:22:52.694355 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:52.694156 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:22:52.702160 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:52.702136 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-165.ec2.internal"
Apr 16 16:22:52.716895 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:52.716868 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 16:22:52.717874 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:52.717861 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-165.ec2.internal"
Apr 16 16:22:52.732594 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:52.732569 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 16:22:53.208882 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.208851 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:22:53.477681 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.477588 2577 apiserver.go:52] "Watching apiserver"
Apr 16 16:22:53.486971 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.486942 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 16:22:53.489096 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.489057 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-gq7bg","openshift-multus/multus-p6shp","openshift-multus/network-metrics-daemon-sdrp4","openshift-network-diagnostics/network-check-target-jpkws","kube-system/kube-apiserver-proxy-ip-10-0-130-165.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9ngr","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-165.ec2.internal","openshift-multus/multus-additional-cni-plugins-b587s","openshift-network-operator/iptables-alerter-2tqg9","openshift-ovn-kubernetes/ovnkube-node-hschh","kube-system/konnectivity-agent-rd84q","openshift-cluster-node-tuning-operator/tuned-dcxck","openshift-dns/node-resolver-7h5k5"]
Apr 16 16:22:53.490884 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.490864 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hschh"
Apr 16 16:22:53.492243 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.492222 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-p6shp"
Apr 16 16:22:53.493647 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.493553 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sdrp4"
Apr 16 16:22:53.493647 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:53.493625 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sdrp4" podUID="858151a3-bcef-4b9a-94c3-32bd1f0db177"
Apr 16 16:22:53.493834 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.493691 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 16:22:53.493978 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.493956 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 16:22:53.493978 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.493959 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 16:22:53.494206 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.494097 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 16:22:53.494899 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.494876 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jpkws"
Apr 16 16:22:53.495001 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:53.494954 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jpkws" podUID="d0d1cd03-838f-49df-b77e-5eb6e1a96deb"
Apr 16 16:22:53.495467 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.495227 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 16:22:53.495467 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.495308 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-gc99q\""
Apr 16 16:22:53.495467 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.495321 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 16:22:53.495467 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.495334 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 16:22:53.495467 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.495356 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 16:22:53.495772 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.495476 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 16:22:53.495772 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.495480 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-mdr2r\""
Apr 16 16:22:53.495772 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.495503 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 16:22:53.496402 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.496303 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9ngr"
Apr 16 16:22:53.497932 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.497909 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-rd84q"
Apr 16 16:22:53.498647 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.498621 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 16:22:53.498743 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.498675 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 16:22:53.500377 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.500101 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-gsv58\""
Apr 16 16:22:53.501032 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.500944 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 16:22:53.501127 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.501046 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-hd8q2\""
Apr 16 16:22:53.501183 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.501154 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 16:22:53.503106 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.501331 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 16:22:53.503106 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.502025 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dcxck"
Apr 16 16:22:53.503106 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.502317 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7h5k5"
Apr 16 16:22:53.504682 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.504649 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 16:22:53.504797 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.504778 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-9hv2k\""
Apr 16 16:22:53.504857 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.504804 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 16:22:53.505067 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.505042 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 16:22:53.505177 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.505108 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 16:22:53.505546 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.505518 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b587s"
Apr 16 16:22:53.505682 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.505665 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-dnx9b\""
Apr 16 16:22:53.507245 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.507223 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2tqg9"
Apr 16 16:22:53.508322 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.508303 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 16:22:53.508624 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.508570 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 16:22:53.508722 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.508679 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-t7wn4\""
Apr 16 16:22:53.508722 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.508703 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gq7bg"
Apr 16 16:22:53.509657 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.509634 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 16:22:53.509742 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.509668 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 16:22:53.510040 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.510021 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-8hgpb\""
Apr 16 16:22:53.510123 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.510082 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 16:22:53.510786 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.510757 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 16:22:53.511168 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.511094 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 16:22:53.511168 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.511103 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 16:22:53.511315 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.511219 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-nx66x\""
Apr 16 16:22:53.513204 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.513184 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-etc-sysconfig\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck"
Apr 16 16:22:53.513294 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.513249 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/71da194f-358e-449e-9a55-2882465c41ef-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-b587s\" (UID: \"71da194f-358e-449e-9a55-2882465c41ef\") " pod="openshift-multus/multus-additional-cni-plugins-b587s"
Apr 16 16:22:53.513294 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.513282 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-run-openvswitch\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh"
Apr 16 16:22:53.513406 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.513310 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-host-slash\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh"
Apr 16 16:22:53.513406 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.513360 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-etc-kubernetes\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp"
Apr 16 16:22:53.513406 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.513387 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/753dfb74-b65d-4c0b-b6d1-a0907d0024bc-konnectivity-ca\") pod \"konnectivity-agent-rd84q\" (UID: \"753dfb74-b65d-4c0b-b6d1-a0907d0024bc\") " pod="kube-system/konnectivity-agent-rd84q"
Apr 16 16:22:53.513568 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.513414 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-etc-modprobe-d\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck"
Apr 16 16:22:53.513568 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.513483 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64-kubelet-dir\") pod \"aws-ebs-csi-driver-node-f9ngr\" (UID: \"c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9ngr"
Apr 16 16:22:53.513568 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.513509 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64-registration-dir\") pod \"aws-ebs-csi-driver-node-f9ngr\" (UID: \"c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9ngr"
Apr 16 16:22:53.513568 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.513535 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/71da194f-358e-449e-9a55-2882465c41ef-system-cni-dir\") pod \"multus-additional-cni-plugins-b587s\" (UID: \"71da194f-358e-449e-9a55-2882465c41ef\") " pod="openshift-multus/multus-additional-cni-plugins-b587s"
Apr 16 16:22:53.513568 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.513563 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/71da194f-358e-449e-9a55-2882465c41ef-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b587s\" (UID: \"71da194f-358e-449e-9a55-2882465c41ef\") " pod="openshift-multus/multus-additional-cni-plugins-b587s"
Apr 16 16:22:53.513787 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.513600 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw9fm\" (UniqueName: \"kubernetes.io/projected/71da194f-358e-449e-9a55-2882465c41ef-kube-api-access-jw9fm\") pod \"multus-additional-cni-plugins-b587s\" (UID: \"71da194f-358e-449e-9a55-2882465c41ef\") " pod="openshift-multus/multus-additional-cni-plugins-b587s"
Apr 16 16:22:53.513787 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.513626 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/937105e9-6cc7-458f-9b5c-007250aa5a6c-tmp-dir\") pod \"node-resolver-7h5k5\" (UID: \"937105e9-6cc7-458f-9b5c-007250aa5a6c\") " pod="openshift-dns/node-resolver-7h5k5"
Apr 16 16:22:53.513787 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.513672 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-var-lib-kubelet\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck"
Apr 16 16:22:53.513787 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.513745 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64-etc-selinux\") pod \"aws-ebs-csi-driver-node-f9ngr\" (UID: \"c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9ngr"
Apr 16 16:22:53.513787 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.513772 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-etc-openvswitch\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh"
Apr 16 16:22:53.514033 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.513790 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-run-ovn\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh"
Apr 16 16:22:53.514033 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.513848 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh"
Apr 16 16:22:53.514033 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.513889 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-system-cni-dir\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp"
Apr 16 16:22:53.514033 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.513919 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-hostroot\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp"
Apr 16 16:22:53.514033 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.513942 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/652350aa-d2fc-4c32-bc1b-e593db927908-ovn-node-metrics-cert\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh"
Apr 16 16:22:53.514033 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.513965 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-os-release\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp"
Apr 16 16:22:53.514033 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.513987 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-multus-conf-dir\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp"
Apr 16 16:22:53.514033 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.514012 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3c6f4643-0f15-43f3-b51e-e048015bf431-multus-daemon-config\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp"
Apr 16 16:22:53.514402 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.514036 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-systemd-units\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh"
Apr 16 16:22:53.514402 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.514069 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-host-cni-bin\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh"
Apr 16 16:22:53.514402 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.514110 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htjz4\" (UniqueName: \"kubernetes.io/projected/652350aa-d2fc-4c32-bc1b-e593db927908-kube-api-access-htjz4\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh"
Apr 16 16:22:53.514402 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.514133 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn2sr\" (UniqueName: \"kubernetes.io/projected/c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64-kube-api-access-kn2sr\") pod \"aws-ebs-csi-driver-node-f9ngr\" (UID: \"c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9ngr"
Apr 16 16:22:53.514402 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.514151 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-host-run-netns\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh"
Apr 16 16:22:53.514402 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.514174 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-node-log\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh"
Apr 16 16:22:53.514402 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.514232 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-host-run-netns\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp"
Apr 16 16:22:53.514402 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.514270 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/71da194f-358e-449e-9a55-2882465c41ef-cnibin\") pod \"multus-additional-cni-plugins-b587s\" (UID: \"71da194f-358e-449e-9a55-2882465c41ef\") " pod="openshift-multus/multus-additional-cni-plugins-b587s"
Apr 16 16:22:53.514402 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.514304 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pqz2\" (UniqueName: \"kubernetes.io/projected/858151a3-bcef-4b9a-94c3-32bd1f0db177-kube-api-access-8pqz2\") pod \"network-metrics-daemon-sdrp4\" (UID: \"858151a3-bcef-4b9a-94c3-32bd1f0db177\") " pod="openshift-multus/network-metrics-daemon-sdrp4"
Apr 16 16:22:53.514402 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.514323 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64-socket-dir\") pod \"aws-ebs-csi-driver-node-f9ngr\" (UID: \"c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9ngr"
Apr 16 16:22:53.514402 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.514371 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-lib-modules\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck"
Apr 16 16:22:53.514827 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.514425 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-tmp\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck"
Apr 16 16:22:53.514827 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.514468 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-host-var-lib-cni-multus\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp"
Apr 16 16:22:53.514827 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.514493 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clhqw\" (UniqueName: \"kubernetes.io/projected/d0d1cd03-838f-49df-b77e-5eb6e1a96deb-kube-api-access-clhqw\") pod \"network-check-target-jpkws\" (UID: \"d0d1cd03-838f-49df-b77e-5eb6e1a96deb\") " pod="openshift-network-diagnostics/network-check-target-jpkws"
Apr 16 16:22:53.514827 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.514517 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-host-var-lib-cni-bin\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp"
Apr 16 16:22:53.514827 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.514576 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-run\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck"
Apr 16 16:22:53.514827 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.514595 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-etc-tuned\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck"
Apr 16 16:22:53.514827 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.514634 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gkw5\" (UniqueName: \"kubernetes.io/projected/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-kube-api-access-2gkw5\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck"
Apr 16 16:22:53.514827 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.514659 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/652350aa-d2fc-4c32-bc1b-e593db927908-env-overrides\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh"
Apr 16 16:22:53.514827 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.514699 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-var-lib-openvswitch\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh"
Apr 16 16:22:53.514827 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.514732 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-host-cni-netd\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh"
Apr 16 16:22:53.514827 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.514757 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-cnibin\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp"
Apr 16 16:22:53.514827 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.514781 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3c6f4643-0f15-43f3-b51e-e048015bf431-cni-binary-copy\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp"
Apr 16 16:22:53.514827 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.514805 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-host-var-lib-kubelet\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp"
Apr 16 16:22:53.514827 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.514829 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-host-run-multus-certs\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp"
Apr 16 16:22:53.515293 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.514853 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started
for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/937105e9-6cc7-458f-9b5c-007250aa5a6c-hosts-file\") pod \"node-resolver-7h5k5\" (UID: \"937105e9-6cc7-458f-9b5c-007250aa5a6c\") " pod="openshift-dns/node-resolver-7h5k5" Apr 16 16:22:53.515293 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.514884 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-etc-sysctl-d\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.515293 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.514906 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-etc-systemd\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.515293 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.514930 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-etc-sysctl-conf\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.515293 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.514954 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64-device-dir\") pod \"aws-ebs-csi-driver-node-f9ngr\" (UID: \"c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9ngr" Apr 16 16:22:53.515293 ip-10-0-130-165 kubenswrapper[2577]: I0416 
16:22:53.514977 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-log-socket\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.515293 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.515000 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/652350aa-d2fc-4c32-bc1b-e593db927908-ovnkube-config\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.515293 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.515024 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/652350aa-d2fc-4c32-bc1b-e593db927908-ovnkube-script-lib\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.515293 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.515058 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-host-run-k8s-cni-cncf-io\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.515293 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.515083 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-host\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " 
pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.515293 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.515106 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/71da194f-358e-449e-9a55-2882465c41ef-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b587s\" (UID: \"71da194f-358e-449e-9a55-2882465c41ef\") " pod="openshift-multus/multus-additional-cni-plugins-b587s" Apr 16 16:22:53.515293 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.515134 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-multus-cni-dir\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.515293 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.515181 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/753dfb74-b65d-4c0b-b6d1-a0907d0024bc-agent-certs\") pod \"konnectivity-agent-rd84q\" (UID: \"753dfb74-b65d-4c0b-b6d1-a0907d0024bc\") " pod="kube-system/konnectivity-agent-rd84q" Apr 16 16:22:53.515293 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.515214 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7jdk\" (UniqueName: \"kubernetes.io/projected/937105e9-6cc7-458f-9b5c-007250aa5a6c-kube-api-access-g7jdk\") pod \"node-resolver-7h5k5\" (UID: \"937105e9-6cc7-458f-9b5c-007250aa5a6c\") " pod="openshift-dns/node-resolver-7h5k5" Apr 16 16:22:53.515293 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.515239 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64-sys-fs\") pod \"aws-ebs-csi-driver-node-f9ngr\" (UID: \"c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9ngr" Apr 16 16:22:53.515293 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.515264 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-host-kubelet\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.515926 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.515289 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k4kb\" (UniqueName: \"kubernetes.io/projected/3c6f4643-0f15-43f3-b51e-e048015bf431-kube-api-access-2k4kb\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.515926 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.515312 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/71da194f-358e-449e-9a55-2882465c41ef-os-release\") pod \"multus-additional-cni-plugins-b587s\" (UID: \"71da194f-358e-449e-9a55-2882465c41ef\") " pod="openshift-multus/multus-additional-cni-plugins-b587s" Apr 16 16:22:53.515926 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.515336 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/71da194f-358e-449e-9a55-2882465c41ef-cni-binary-copy\") pod \"multus-additional-cni-plugins-b587s\" (UID: \"71da194f-358e-449e-9a55-2882465c41ef\") " pod="openshift-multus/multus-additional-cni-plugins-b587s" Apr 16 16:22:53.515926 ip-10-0-130-165 
kubenswrapper[2577]: I0416 16:22:53.515374 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-run-systemd\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.515926 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.515405 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-host-run-ovn-kubernetes\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.515926 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.515427 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/858151a3-bcef-4b9a-94c3-32bd1f0db177-metrics-certs\") pod \"network-metrics-daemon-sdrp4\" (UID: \"858151a3-bcef-4b9a-94c3-32bd1f0db177\") " pod="openshift-multus/network-metrics-daemon-sdrp4" Apr 16 16:22:53.515926 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.515463 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-etc-kubernetes\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.515926 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.515483 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-sys\") pod \"tuned-dcxck\" (UID: 
\"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.515926 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.515506 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-multus-socket-dir-parent\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.554208 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.554173 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 16:17:52 +0000 UTC" deadline="2027-12-09 04:18:05.907641979 +0000 UTC" Apr 16 16:22:53.554208 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.554207 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14435h55m12.353438436s" Apr 16 16:22:53.604109 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.604070 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 16:22:53.616348 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.616308 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/71da194f-358e-449e-9a55-2882465c41ef-os-release\") pod \"multus-additional-cni-plugins-b587s\" (UID: \"71da194f-358e-449e-9a55-2882465c41ef\") " pod="openshift-multus/multus-additional-cni-plugins-b587s" Apr 16 16:22:53.616348 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.616355 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/71da194f-358e-449e-9a55-2882465c41ef-cni-binary-copy\") pod \"multus-additional-cni-plugins-b587s\" (UID: 
\"71da194f-358e-449e-9a55-2882465c41ef\") " pod="openshift-multus/multus-additional-cni-plugins-b587s" Apr 16 16:22:53.616662 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.616375 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-run-systemd\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.616662 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.616425 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-run-systemd\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.616662 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.616459 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/71da194f-358e-449e-9a55-2882465c41ef-os-release\") pod \"multus-additional-cni-plugins-b587s\" (UID: \"71da194f-358e-449e-9a55-2882465c41ef\") " pod="openshift-multus/multus-additional-cni-plugins-b587s" Apr 16 16:22:53.616662 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.616480 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-host-run-ovn-kubernetes\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.616662 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.616515 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/858151a3-bcef-4b9a-94c3-32bd1f0db177-metrics-certs\") pod 
\"network-metrics-daemon-sdrp4\" (UID: \"858151a3-bcef-4b9a-94c3-32bd1f0db177\") " pod="openshift-multus/network-metrics-daemon-sdrp4" Apr 16 16:22:53.616662 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.616537 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-etc-kubernetes\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.616662 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.616562 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-sys\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.616662 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.616582 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-host-run-ovn-kubernetes\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.616662 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.616587 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-multus-socket-dir-parent\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.616662 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.616629 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-etc-sysconfig\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.616662 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.616636 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-multus-socket-dir-parent\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.616662 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.616660 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/71da194f-358e-449e-9a55-2882465c41ef-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-b587s\" (UID: \"71da194f-358e-449e-9a55-2882465c41ef\") " pod="openshift-multus/multus-additional-cni-plugins-b587s" Apr 16 16:22:53.617223 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.616686 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-sys\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.617223 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:53.616693 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:22:53.617223 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.616722 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-run-openvswitch\") pod \"ovnkube-node-hschh\" (UID: 
\"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.617223 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.616717 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-etc-kubernetes\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.617223 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.616688 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-run-openvswitch\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.617223 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.616732 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-etc-sysconfig\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.617223 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:53.616801 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/858151a3-bcef-4b9a-94c3-32bd1f0db177-metrics-certs podName:858151a3-bcef-4b9a-94c3-32bd1f0db177 nodeName:}" failed. No retries permitted until 2026-04-16 16:22:54.116761923 +0000 UTC m=+3.081542415 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/858151a3-bcef-4b9a-94c3-32bd1f0db177-metrics-certs") pod "network-metrics-daemon-sdrp4" (UID: "858151a3-bcef-4b9a-94c3-32bd1f0db177") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:22:53.617223 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.616842 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/994d74ed-a014-4bb9-9549-70f76b64ca30-iptables-alerter-script\") pod \"iptables-alerter-2tqg9\" (UID: \"994d74ed-a014-4bb9-9549-70f76b64ca30\") " pod="openshift-network-operator/iptables-alerter-2tqg9" Apr 16 16:22:53.617223 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.616875 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/994d74ed-a014-4bb9-9549-70f76b64ca30-host-slash\") pod \"iptables-alerter-2tqg9\" (UID: \"994d74ed-a014-4bb9-9549-70f76b64ca30\") " pod="openshift-network-operator/iptables-alerter-2tqg9" Apr 16 16:22:53.617223 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.616909 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-host-slash\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.617223 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.616934 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-etc-kubernetes\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.617223 ip-10-0-130-165 
kubenswrapper[2577]: I0416 16:22:53.616980 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-host-slash\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.617223 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.616991 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/753dfb74-b65d-4c0b-b6d1-a0907d0024bc-konnectivity-ca\") pod \"konnectivity-agent-rd84q\" (UID: \"753dfb74-b65d-4c0b-b6d1-a0907d0024bc\") " pod="kube-system/konnectivity-agent-rd84q" Apr 16 16:22:53.617223 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617018 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-etc-modprobe-d\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.617223 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617033 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-etc-kubernetes\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.617223 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617067 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64-kubelet-dir\") pod \"aws-ebs-csi-driver-node-f9ngr\" (UID: \"c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9ngr" Apr 16 16:22:53.617223 ip-10-0-130-165 
kubenswrapper[2577]: I0416 16:22:53.617099 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64-registration-dir\") pod \"aws-ebs-csi-driver-node-f9ngr\" (UID: \"c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9ngr" Apr 16 16:22:53.617980 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617116 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64-kubelet-dir\") pod \"aws-ebs-csi-driver-node-f9ngr\" (UID: \"c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9ngr" Apr 16 16:22:53.617980 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617126 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/71da194f-358e-449e-9a55-2882465c41ef-system-cni-dir\") pod \"multus-additional-cni-plugins-b587s\" (UID: \"71da194f-358e-449e-9a55-2882465c41ef\") " pod="openshift-multus/multus-additional-cni-plugins-b587s" Apr 16 16:22:53.617980 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617069 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/71da194f-358e-449e-9a55-2882465c41ef-cni-binary-copy\") pod \"multus-additional-cni-plugins-b587s\" (UID: \"71da194f-358e-449e-9a55-2882465c41ef\") " pod="openshift-multus/multus-additional-cni-plugins-b587s" Apr 16 16:22:53.617980 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617153 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/71da194f-358e-449e-9a55-2882465c41ef-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b587s\" 
(UID: \"71da194f-358e-449e-9a55-2882465c41ef\") " pod="openshift-multus/multus-additional-cni-plugins-b587s" Apr 16 16:22:53.617980 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617158 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-etc-modprobe-d\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.617980 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617191 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jw9fm\" (UniqueName: \"kubernetes.io/projected/71da194f-358e-449e-9a55-2882465c41ef-kube-api-access-jw9fm\") pod \"multus-additional-cni-plugins-b587s\" (UID: \"71da194f-358e-449e-9a55-2882465c41ef\") " pod="openshift-multus/multus-additional-cni-plugins-b587s" Apr 16 16:22:53.617980 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617220 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64-registration-dir\") pod \"aws-ebs-csi-driver-node-f9ngr\" (UID: \"c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9ngr" Apr 16 16:22:53.617980 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617218 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/937105e9-6cc7-458f-9b5c-007250aa5a6c-tmp-dir\") pod \"node-resolver-7h5k5\" (UID: \"937105e9-6cc7-458f-9b5c-007250aa5a6c\") " pod="openshift-dns/node-resolver-7h5k5" Apr 16 16:22:53.617980 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617261 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/71da194f-358e-449e-9a55-2882465c41ef-system-cni-dir\") pod \"multus-additional-cni-plugins-b587s\" (UID: \"71da194f-358e-449e-9a55-2882465c41ef\") " pod="openshift-multus/multus-additional-cni-plugins-b587s" Apr 16 16:22:53.617980 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617266 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/71da194f-358e-449e-9a55-2882465c41ef-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-b587s\" (UID: \"71da194f-358e-449e-9a55-2882465c41ef\") " pod="openshift-multus/multus-additional-cni-plugins-b587s" Apr 16 16:22:53.617980 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617291 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-var-lib-kubelet\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.617980 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617317 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64-etc-selinux\") pod \"aws-ebs-csi-driver-node-f9ngr\" (UID: \"c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9ngr" Apr 16 16:22:53.617980 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617334 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-var-lib-kubelet\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.617980 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617344 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-etc-openvswitch\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.617980 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617370 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-run-ovn\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.617980 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617388 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64-etc-selinux\") pod \"aws-ebs-csi-driver-node-f9ngr\" (UID: \"c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9ngr" Apr 16 16:22:53.617980 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617400 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.618770 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617428 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-system-cni-dir\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.618770 ip-10-0-130-165 
kubenswrapper[2577]: I0416 16:22:53.617471 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-hostroot\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.618770 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617495 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/652350aa-d2fc-4c32-bc1b-e593db927908-ovn-node-metrics-cert\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.618770 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617519 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-os-release\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.618770 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617570 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/753dfb74-b65d-4c0b-b6d1-a0907d0024bc-konnectivity-ca\") pod \"konnectivity-agent-rd84q\" (UID: \"753dfb74-b65d-4c0b-b6d1-a0907d0024bc\") " pod="kube-system/konnectivity-agent-rd84q" Apr 16 16:22:53.618770 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617588 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-os-release\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.618770 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617566 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.618770 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617590 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-multus-conf-dir\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.618770 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617587 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/937105e9-6cc7-458f-9b5c-007250aa5a6c-tmp-dir\") pod \"node-resolver-7h5k5\" (UID: \"937105e9-6cc7-458f-9b5c-007250aa5a6c\") " pod="openshift-dns/node-resolver-7h5k5" Apr 16 16:22:53.618770 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617616 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-multus-conf-dir\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.618770 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617636 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-hostroot\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.618770 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617639 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-etc-openvswitch\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.618770 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617637 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3c6f4643-0f15-43f3-b51e-e048015bf431-multus-daemon-config\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.618770 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617672 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-system-cni-dir\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.618770 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617681 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/71da194f-358e-449e-9a55-2882465c41ef-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b587s\" (UID: \"71da194f-358e-449e-9a55-2882465c41ef\") " pod="openshift-multus/multus-additional-cni-plugins-b587s" Apr 16 16:22:53.618770 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617747 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-run-ovn\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.618770 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617764 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-n2l5q\" (UniqueName: \"kubernetes.io/projected/994d74ed-a014-4bb9-9549-70f76b64ca30-kube-api-access-n2l5q\") pod \"iptables-alerter-2tqg9\" (UID: \"994d74ed-a014-4bb9-9549-70f76b64ca30\") " pod="openshift-network-operator/iptables-alerter-2tqg9" Apr 16 16:22:53.618770 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617796 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b5a35ec4-25f4-4c5b-8175-23e377d3e9b3-host\") pod \"node-ca-gq7bg\" (UID: \"b5a35ec4-25f4-4c5b-8175-23e377d3e9b3\") " pod="openshift-image-registry/node-ca-gq7bg" Apr 16 16:22:53.619606 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617820 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lspjf\" (UniqueName: \"kubernetes.io/projected/b5a35ec4-25f4-4c5b-8175-23e377d3e9b3-kube-api-access-lspjf\") pod \"node-ca-gq7bg\" (UID: \"b5a35ec4-25f4-4c5b-8175-23e377d3e9b3\") " pod="openshift-image-registry/node-ca-gq7bg" Apr 16 16:22:53.619606 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617849 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-systemd-units\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.619606 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617876 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-host-cni-bin\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.619606 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617905 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-htjz4\" (UniqueName: \"kubernetes.io/projected/652350aa-d2fc-4c32-bc1b-e593db927908-kube-api-access-htjz4\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.619606 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617899 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-systemd-units\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.619606 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617934 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kn2sr\" (UniqueName: \"kubernetes.io/projected/c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64-kube-api-access-kn2sr\") pod \"aws-ebs-csi-driver-node-f9ngr\" (UID: \"c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9ngr" Apr 16 16:22:53.619606 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617952 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-host-cni-bin\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.619606 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.617967 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-host-run-netns\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.619606 ip-10-0-130-165 
kubenswrapper[2577]: I0416 16:22:53.617994 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-node-log\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.619606 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618018 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-host-run-netns\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.619606 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618019 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 16:22:53.619606 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618034 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-host-run-netns\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.619606 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618042 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/71da194f-358e-449e-9a55-2882465c41ef-cnibin\") pod \"multus-additional-cni-plugins-b587s\" (UID: \"71da194f-358e-449e-9a55-2882465c41ef\") " pod="openshift-multus/multus-additional-cni-plugins-b587s" Apr 16 16:22:53.619606 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618086 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" 
(UniqueName: \"kubernetes.io/host-path/71da194f-358e-449e-9a55-2882465c41ef-cnibin\") pod \"multus-additional-cni-plugins-b587s\" (UID: \"71da194f-358e-449e-9a55-2882465c41ef\") " pod="openshift-multus/multus-additional-cni-plugins-b587s" Apr 16 16:22:53.619606 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618089 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8pqz2\" (UniqueName: \"kubernetes.io/projected/858151a3-bcef-4b9a-94c3-32bd1f0db177-kube-api-access-8pqz2\") pod \"network-metrics-daemon-sdrp4\" (UID: \"858151a3-bcef-4b9a-94c3-32bd1f0db177\") " pod="openshift-multus/network-metrics-daemon-sdrp4" Apr 16 16:22:53.619606 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618122 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64-socket-dir\") pod \"aws-ebs-csi-driver-node-f9ngr\" (UID: \"c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9ngr" Apr 16 16:22:53.619606 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618120 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-host-run-netns\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.619606 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618150 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-lib-modules\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.620576 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618220 2577 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3c6f4643-0f15-43f3-b51e-e048015bf431-multus-daemon-config\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.620576 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618236 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-tmp\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.620576 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618245 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64-socket-dir\") pod \"aws-ebs-csi-driver-node-f9ngr\" (UID: \"c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9ngr" Apr 16 16:22:53.620576 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618263 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-host-var-lib-cni-multus\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.620576 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618251 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-lib-modules\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.620576 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618303 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-host-var-lib-cni-multus\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.620576 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618291 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clhqw\" (UniqueName: \"kubernetes.io/projected/d0d1cd03-838f-49df-b77e-5eb6e1a96deb-kube-api-access-clhqw\") pod \"network-check-target-jpkws\" (UID: \"d0d1cd03-838f-49df-b77e-5eb6e1a96deb\") " pod="openshift-network-diagnostics/network-check-target-jpkws" Apr 16 16:22:53.620576 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618335 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-host-var-lib-cni-bin\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.620576 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618356 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b5a35ec4-25f4-4c5b-8175-23e377d3e9b3-serviceca\") pod \"node-ca-gq7bg\" (UID: \"b5a35ec4-25f4-4c5b-8175-23e377d3e9b3\") " pod="openshift-image-registry/node-ca-gq7bg" Apr 16 16:22:53.620576 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618378 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-run\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.620576 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618403 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-etc-tuned\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.620576 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618429 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2gkw5\" (UniqueName: \"kubernetes.io/projected/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-kube-api-access-2gkw5\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.620576 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618459 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-run\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.620576 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618470 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/652350aa-d2fc-4c32-bc1b-e593db927908-env-overrides\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.620576 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618496 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-var-lib-openvswitch\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.620576 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618522 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-host-cni-netd\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.620576 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618545 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-cnibin\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.620576 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618562 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3c6f4643-0f15-43f3-b51e-e048015bf431-cni-binary-copy\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.621408 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618591 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-host-var-lib-kubelet\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.621408 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618604 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-node-log\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.621408 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618619 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-host-run-multus-certs\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.621408 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618645 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/937105e9-6cc7-458f-9b5c-007250aa5a6c-hosts-file\") pod \"node-resolver-7h5k5\" (UID: \"937105e9-6cc7-458f-9b5c-007250aa5a6c\") " pod="openshift-dns/node-resolver-7h5k5" Apr 16 16:22:53.621408 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618650 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-host-cni-netd\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.621408 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618673 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-etc-sysctl-d\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.621408 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618713 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-etc-systemd\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.621408 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618739 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-etc-sysctl-conf\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.621408 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618435 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-host-var-lib-cni-bin\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.621408 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618778 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64-device-dir\") pod \"aws-ebs-csi-driver-node-f9ngr\" (UID: \"c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9ngr" Apr 16 16:22:53.621408 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618840 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64-device-dir\") pod \"aws-ebs-csi-driver-node-f9ngr\" (UID: \"c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9ngr" Apr 16 16:22:53.621408 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618848 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-var-lib-openvswitch\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.621408 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618856 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-log-socket\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.621408 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618894 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-cnibin\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.621408 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618909 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/652350aa-d2fc-4c32-bc1b-e593db927908-ovnkube-config\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.621408 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618937 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/652350aa-d2fc-4c32-bc1b-e593db927908-ovnkube-script-lib\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.621408 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618956 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/937105e9-6cc7-458f-9b5c-007250aa5a6c-hosts-file\") pod \"node-resolver-7h5k5\" (UID: \"937105e9-6cc7-458f-9b5c-007250aa5a6c\") " pod="openshift-dns/node-resolver-7h5k5" Apr 16 16:22:53.621408 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618963 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-host-run-k8s-cni-cncf-io\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.622188 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618985 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-etc-sysctl-d\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.622188 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.618999 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-host\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.622188 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.619026 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/71da194f-358e-449e-9a55-2882465c41ef-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b587s\" (UID: \"71da194f-358e-449e-9a55-2882465c41ef\") " pod="openshift-multus/multus-additional-cni-plugins-b587s" Apr 16 16:22:53.622188 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.619044 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/652350aa-d2fc-4c32-bc1b-e593db927908-env-overrides\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.622188 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.619055 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-multus-cni-dir\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.622188 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.619085 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/753dfb74-b65d-4c0b-b6d1-a0907d0024bc-agent-certs\") pod \"konnectivity-agent-rd84q\" (UID: \"753dfb74-b65d-4c0b-b6d1-a0907d0024bc\") " pod="kube-system/konnectivity-agent-rd84q" Apr 16 16:22:53.622188 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.619100 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-host-var-lib-kubelet\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.622188 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.619139 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-host-run-multus-certs\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.622188 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.619151 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-host\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.622188 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.619164 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-etc-sysctl-conf\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.622188 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.619173 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g7jdk\" (UniqueName: \"kubernetes.io/projected/937105e9-6cc7-458f-9b5c-007250aa5a6c-kube-api-access-g7jdk\") pod \"node-resolver-7h5k5\" (UID: \"937105e9-6cc7-458f-9b5c-007250aa5a6c\") " pod="openshift-dns/node-resolver-7h5k5" Apr 16 16:22:53.622188 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.619202 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64-sys-fs\") pod \"aws-ebs-csi-driver-node-f9ngr\" (UID: \"c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9ngr" Apr 16 16:22:53.622188 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.619246 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-host-kubelet\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.622188 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.619273 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2k4kb\" (UniqueName: \"kubernetes.io/projected/3c6f4643-0f15-43f3-b51e-e048015bf431-kube-api-access-2k4kb\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.622188 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.619283 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-log-socket\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.622188 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.619056 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-etc-systemd\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.622188 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.619363 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64-sys-fs\") pod \"aws-ebs-csi-driver-node-f9ngr\" (UID: \"c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9ngr" Apr 16 16:22:53.622188 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.619392 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/652350aa-d2fc-4c32-bc1b-e593db927908-host-kubelet\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.622817 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.619248 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-multus-cni-dir\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.622817 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.619392 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/3c6f4643-0f15-43f3-b51e-e048015bf431-cni-binary-copy\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.622817 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.619434 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3c6f4643-0f15-43f3-b51e-e048015bf431-host-run-k8s-cni-cncf-io\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.622817 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.619781 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/71da194f-358e-449e-9a55-2882465c41ef-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b587s\" (UID: \"71da194f-358e-449e-9a55-2882465c41ef\") " pod="openshift-multus/multus-additional-cni-plugins-b587s" Apr 16 16:22:53.622817 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.619830 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/652350aa-d2fc-4c32-bc1b-e593db927908-ovnkube-script-lib\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.622817 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.620110 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/652350aa-d2fc-4c32-bc1b-e593db927908-ovnkube-config\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.622817 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.621661 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-tmp\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.622817 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.621769 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/652350aa-d2fc-4c32-bc1b-e593db927908-ovn-node-metrics-cert\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.622817 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.621986 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-etc-tuned\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.622817 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.622404 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/753dfb74-b65d-4c0b-b6d1-a0907d0024bc-agent-certs\") pod \"konnectivity-agent-rd84q\" (UID: \"753dfb74-b65d-4c0b-b6d1-a0907d0024bc\") " pod="kube-system/konnectivity-agent-rd84q" Apr 16 16:22:53.628427 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:53.628067 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:22:53.628427 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:53.628098 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:22:53.628427 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:53.628114 2577 projected.go:194] Error 
preparing data for projected volume kube-api-access-clhqw for pod openshift-network-diagnostics/network-check-target-jpkws: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:22:53.628427 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:53.628190 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d0d1cd03-838f-49df-b77e-5eb6e1a96deb-kube-api-access-clhqw podName:d0d1cd03-838f-49df-b77e-5eb6e1a96deb nodeName:}" failed. No retries permitted until 2026-04-16 16:22:54.128167693 +0000 UTC m=+3.092948199 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-clhqw" (UniqueName: "kubernetes.io/projected/d0d1cd03-838f-49df-b77e-5eb6e1a96deb-kube-api-access-clhqw") pod "network-check-target-jpkws" (UID: "d0d1cd03-838f-49df-b77e-5eb6e1a96deb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:22:53.631154 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.631120 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k4kb\" (UniqueName: \"kubernetes.io/projected/3c6f4643-0f15-43f3-b51e-e048015bf431-kube-api-access-2k4kb\") pod \"multus-p6shp\" (UID: \"3c6f4643-0f15-43f3-b51e-e048015bf431\") " pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.631154 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.631136 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pqz2\" (UniqueName: \"kubernetes.io/projected/858151a3-bcef-4b9a-94c3-32bd1f0db177-kube-api-access-8pqz2\") pod \"network-metrics-daemon-sdrp4\" (UID: \"858151a3-bcef-4b9a-94c3-32bd1f0db177\") " pod="openshift-multus/network-metrics-daemon-sdrp4" Apr 16 16:22:53.632074 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.631602 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw9fm\" (UniqueName: \"kubernetes.io/projected/71da194f-358e-449e-9a55-2882465c41ef-kube-api-access-jw9fm\") pod \"multus-additional-cni-plugins-b587s\" (UID: \"71da194f-358e-449e-9a55-2882465c41ef\") " pod="openshift-multus/multus-additional-cni-plugins-b587s" Apr 16 16:22:53.632074 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.631610 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gkw5\" (UniqueName: \"kubernetes.io/projected/4ea338ac-fe3a-449a-a4ba-c8d631e6b043-kube-api-access-2gkw5\") pod \"tuned-dcxck\" (UID: \"4ea338ac-fe3a-449a-a4ba-c8d631e6b043\") " pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.632074 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.632035 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-htjz4\" (UniqueName: \"kubernetes.io/projected/652350aa-d2fc-4c32-bc1b-e593db927908-kube-api-access-htjz4\") pod \"ovnkube-node-hschh\" (UID: \"652350aa-d2fc-4c32-bc1b-e593db927908\") " pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.632293 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.632177 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7jdk\" (UniqueName: \"kubernetes.io/projected/937105e9-6cc7-458f-9b5c-007250aa5a6c-kube-api-access-g7jdk\") pod \"node-resolver-7h5k5\" (UID: \"937105e9-6cc7-458f-9b5c-007250aa5a6c\") " pod="openshift-dns/node-resolver-7h5k5" Apr 16 16:22:53.632860 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.632839 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn2sr\" (UniqueName: \"kubernetes.io/projected/c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64-kube-api-access-kn2sr\") pod \"aws-ebs-csi-driver-node-f9ngr\" (UID: \"c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9ngr" Apr 16 16:22:53.720260 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.719909 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n2l5q\" (UniqueName: \"kubernetes.io/projected/994d74ed-a014-4bb9-9549-70f76b64ca30-kube-api-access-n2l5q\") pod \"iptables-alerter-2tqg9\" (UID: \"994d74ed-a014-4bb9-9549-70f76b64ca30\") " pod="openshift-network-operator/iptables-alerter-2tqg9" Apr 16 16:22:53.720260 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.719976 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b5a35ec4-25f4-4c5b-8175-23e377d3e9b3-host\") pod \"node-ca-gq7bg\" (UID: \"b5a35ec4-25f4-4c5b-8175-23e377d3e9b3\") " pod="openshift-image-registry/node-ca-gq7bg" Apr 16 16:22:53.720260 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.720018 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lspjf\" (UniqueName: \"kubernetes.io/projected/b5a35ec4-25f4-4c5b-8175-23e377d3e9b3-kube-api-access-lspjf\") pod \"node-ca-gq7bg\" (UID: \"b5a35ec4-25f4-4c5b-8175-23e377d3e9b3\") " pod="openshift-image-registry/node-ca-gq7bg" Apr 16 16:22:53.720260 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.720078 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b5a35ec4-25f4-4c5b-8175-23e377d3e9b3-serviceca\") pod \"node-ca-gq7bg\" (UID: \"b5a35ec4-25f4-4c5b-8175-23e377d3e9b3\") " pod="openshift-image-registry/node-ca-gq7bg" Apr 16 16:22:53.720260 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.720159 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/994d74ed-a014-4bb9-9549-70f76b64ca30-iptables-alerter-script\") pod \"iptables-alerter-2tqg9\" (UID: 
\"994d74ed-a014-4bb9-9549-70f76b64ca30\") " pod="openshift-network-operator/iptables-alerter-2tqg9" Apr 16 16:22:53.720260 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.720193 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/994d74ed-a014-4bb9-9549-70f76b64ca30-host-slash\") pod \"iptables-alerter-2tqg9\" (UID: \"994d74ed-a014-4bb9-9549-70f76b64ca30\") " pod="openshift-network-operator/iptables-alerter-2tqg9" Apr 16 16:22:53.720742 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.720358 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/994d74ed-a014-4bb9-9549-70f76b64ca30-host-slash\") pod \"iptables-alerter-2tqg9\" (UID: \"994d74ed-a014-4bb9-9549-70f76b64ca30\") " pod="openshift-network-operator/iptables-alerter-2tqg9" Apr 16 16:22:53.721544 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.720878 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b5a35ec4-25f4-4c5b-8175-23e377d3e9b3-serviceca\") pod \"node-ca-gq7bg\" (UID: \"b5a35ec4-25f4-4c5b-8175-23e377d3e9b3\") " pod="openshift-image-registry/node-ca-gq7bg" Apr 16 16:22:53.721544 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.720916 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b5a35ec4-25f4-4c5b-8175-23e377d3e9b3-host\") pod \"node-ca-gq7bg\" (UID: \"b5a35ec4-25f4-4c5b-8175-23e377d3e9b3\") " pod="openshift-image-registry/node-ca-gq7bg" Apr 16 16:22:53.721544 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.721329 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/994d74ed-a014-4bb9-9549-70f76b64ca30-iptables-alerter-script\") pod \"iptables-alerter-2tqg9\" (UID: 
\"994d74ed-a014-4bb9-9549-70f76b64ca30\") " pod="openshift-network-operator/iptables-alerter-2tqg9" Apr 16 16:22:53.729726 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.729657 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lspjf\" (UniqueName: \"kubernetes.io/projected/b5a35ec4-25f4-4c5b-8175-23e377d3e9b3-kube-api-access-lspjf\") pod \"node-ca-gq7bg\" (UID: \"b5a35ec4-25f4-4c5b-8175-23e377d3e9b3\") " pod="openshift-image-registry/node-ca-gq7bg" Apr 16 16:22:53.729726 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.729713 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2l5q\" (UniqueName: \"kubernetes.io/projected/994d74ed-a014-4bb9-9549-70f76b64ca30-kube-api-access-n2l5q\") pod \"iptables-alerter-2tqg9\" (UID: \"994d74ed-a014-4bb9-9549-70f76b64ca30\") " pod="openshift-network-operator/iptables-alerter-2tqg9" Apr 16 16:22:53.802156 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.802121 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:22:53.804431 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.804401 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:22:53.814375 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.814339 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-p6shp" Apr 16 16:22:53.822199 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.822174 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9ngr" Apr 16 16:22:53.827867 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.827833 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-rd84q" Apr 16 16:22:53.835608 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.835569 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dcxck" Apr 16 16:22:53.842349 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.842315 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7h5k5" Apr 16 16:22:53.850116 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.850085 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b587s" Apr 16 16:22:53.856870 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.856837 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2tqg9" Apr 16 16:22:53.862601 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:53.862575 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-gq7bg" Apr 16 16:22:54.123619 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:54.123493 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/858151a3-bcef-4b9a-94c3-32bd1f0db177-metrics-certs\") pod \"network-metrics-daemon-sdrp4\" (UID: \"858151a3-bcef-4b9a-94c3-32bd1f0db177\") " pod="openshift-multus/network-metrics-daemon-sdrp4" Apr 16 16:22:54.123768 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:54.123663 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:22:54.123768 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:54.123747 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/858151a3-bcef-4b9a-94c3-32bd1f0db177-metrics-certs podName:858151a3-bcef-4b9a-94c3-32bd1f0db177 nodeName:}" failed. No retries permitted until 2026-04-16 16:22:55.123728624 +0000 UTC m=+4.088509113 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/858151a3-bcef-4b9a-94c3-32bd1f0db177-metrics-certs") pod "network-metrics-daemon-sdrp4" (UID: "858151a3-bcef-4b9a-94c3-32bd1f0db177") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:22:54.224137 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:54.224111 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clhqw\" (UniqueName: \"kubernetes.io/projected/d0d1cd03-838f-49df-b77e-5eb6e1a96deb-kube-api-access-clhqw\") pod \"network-check-target-jpkws\" (UID: \"d0d1cd03-838f-49df-b77e-5eb6e1a96deb\") " pod="openshift-network-diagnostics/network-check-target-jpkws" Apr 16 16:22:54.224264 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:54.224248 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:22:54.224305 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:54.224270 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:22:54.224305 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:54.224279 2577 projected.go:194] Error preparing data for projected volume kube-api-access-clhqw for pod openshift-network-diagnostics/network-check-target-jpkws: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:22:54.224389 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:54.224327 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d0d1cd03-838f-49df-b77e-5eb6e1a96deb-kube-api-access-clhqw podName:d0d1cd03-838f-49df-b77e-5eb6e1a96deb nodeName:}" failed. 
No retries permitted until 2026-04-16 16:22:55.224312356 +0000 UTC m=+4.189092830 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-clhqw" (UniqueName: "kubernetes.io/projected/d0d1cd03-838f-49df-b77e-5eb6e1a96deb-kube-api-access-clhqw") pod "network-check-target-jpkws" (UID: "d0d1cd03-838f-49df-b77e-5eb6e1a96deb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:22:54.249901 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:54.249869 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c6f4643_0f15_43f3_b51e_e048015bf431.slice/crio-7b88bf21fb25dbbf57a8379623adec716c889fe54ba5909fc1e645d4bdb19dd0 WatchSource:0}: Error finding container 7b88bf21fb25dbbf57a8379623adec716c889fe54ba5909fc1e645d4bdb19dd0: Status 404 returned error can't find the container with id 7b88bf21fb25dbbf57a8379623adec716c889fe54ba5909fc1e645d4bdb19dd0 Apr 16 16:22:54.252081 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:54.252052 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc22ec3e6_b1ed_430d_be1f_d7a2c8fe1a64.slice/crio-d3fbae8588cb7d65cf51954090edee5266fe5e471e5f31ac140a0876e2172d7a WatchSource:0}: Error finding container d3fbae8588cb7d65cf51954090edee5266fe5e471e5f31ac140a0876e2172d7a: Status 404 returned error can't find the container with id d3fbae8588cb7d65cf51954090edee5266fe5e471e5f31ac140a0876e2172d7a Apr 16 16:22:54.258347 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:54.258315 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5a35ec4_25f4_4c5b_8175_23e377d3e9b3.slice/crio-3b0f38fbdd80e603b8d229f8b84217b08d030320d6edeb29e713805e8620b896 WatchSource:0}: Error finding container 
3b0f38fbdd80e603b8d229f8b84217b08d030320d6edeb29e713805e8620b896: Status 404 returned error can't find the container with id 3b0f38fbdd80e603b8d229f8b84217b08d030320d6edeb29e713805e8620b896 Apr 16 16:22:54.258744 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:54.258712 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71da194f_358e_449e_9a55_2882465c41ef.slice/crio-7cf89c1b1676ee2fadd38eb9022d1047a34ac94c53dde53dc8da49a471bf7f5c WatchSource:0}: Error finding container 7cf89c1b1676ee2fadd38eb9022d1047a34ac94c53dde53dc8da49a471bf7f5c: Status 404 returned error can't find the container with id 7cf89c1b1676ee2fadd38eb9022d1047a34ac94c53dde53dc8da49a471bf7f5c Apr 16 16:22:54.259115 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:54.259093 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod994d74ed_a014_4bb9_9549_70f76b64ca30.slice/crio-c82673c4bc99804b9b302103fc519d5f45291510350b4766a5bb3118122f1b43 WatchSource:0}: Error finding container c82673c4bc99804b9b302103fc519d5f45291510350b4766a5bb3118122f1b43: Status 404 returned error can't find the container with id c82673c4bc99804b9b302103fc519d5f45291510350b4766a5bb3118122f1b43 Apr 16 16:22:54.260993 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:54.260741 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ea338ac_fe3a_449a_a4ba_c8d631e6b043.slice/crio-facbc702d60f561cda5b546956d91cacf00621064f0821111de69a4c8245907a WatchSource:0}: Error finding container facbc702d60f561cda5b546956d91cacf00621064f0821111de69a4c8245907a: Status 404 returned error can't find the container with id facbc702d60f561cda5b546956d91cacf00621064f0821111de69a4c8245907a Apr 16 16:22:54.262303 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:54.261922 2577 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod753dfb74_b65d_4c0b_b6d1_a0907d0024bc.slice/crio-6878e6623e07472b894644c0178d92d021da1e0b9555b3db89eba06f6f26d77f WatchSource:0}: Error finding container 6878e6623e07472b894644c0178d92d021da1e0b9555b3db89eba06f6f26d77f: Status 404 returned error can't find the container with id 6878e6623e07472b894644c0178d92d021da1e0b9555b3db89eba06f6f26d77f Apr 16 16:22:54.262693 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:54.262668 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod652350aa_d2fc_4c32_bc1b_e593db927908.slice/crio-1e5d59067419650ab0fc743ba87c99893c47723537291fc4b33c4370d7f6e63d WatchSource:0}: Error finding container 1e5d59067419650ab0fc743ba87c99893c47723537291fc4b33c4370d7f6e63d: Status 404 returned error can't find the container with id 1e5d59067419650ab0fc743ba87c99893c47723537291fc4b33c4370d7f6e63d Apr 16 16:22:54.264657 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:22:54.264220 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod937105e9_6cc7_458f_9b5c_007250aa5a6c.slice/crio-1e769f92529f04bf7abb16afd7c3c22ff4e12dfe41ada0a69162624274772462 WatchSource:0}: Error finding container 1e769f92529f04bf7abb16afd7c3c22ff4e12dfe41ada0a69162624274772462: Status 404 returned error can't find the container with id 1e769f92529f04bf7abb16afd7c3c22ff4e12dfe41ada0a69162624274772462 Apr 16 16:22:54.554684 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:54.554366 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 16:17:52 +0000 UTC" deadline="2027-11-30 04:38:39.418596691 +0000 UTC" Apr 16 16:22:54.554684 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:54.554604 2577 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="14220h15m44.863998932s" Apr 16 16:22:54.654265 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:54.654230 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7h5k5" event={"ID":"937105e9-6cc7-458f-9b5c-007250aa5a6c","Type":"ContainerStarted","Data":"1e769f92529f04bf7abb16afd7c3c22ff4e12dfe41ada0a69162624274772462"} Apr 16 16:22:54.656968 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:54.656922 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dcxck" event={"ID":"4ea338ac-fe3a-449a-a4ba-c8d631e6b043","Type":"ContainerStarted","Data":"facbc702d60f561cda5b546956d91cacf00621064f0821111de69a4c8245907a"} Apr 16 16:22:54.660001 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:54.659963 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2tqg9" event={"ID":"994d74ed-a014-4bb9-9549-70f76b64ca30","Type":"ContainerStarted","Data":"c82673c4bc99804b9b302103fc519d5f45291510350b4766a5bb3118122f1b43"} Apr 16 16:22:54.662077 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:54.662048 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gq7bg" event={"ID":"b5a35ec4-25f4-4c5b-8175-23e377d3e9b3","Type":"ContainerStarted","Data":"3b0f38fbdd80e603b8d229f8b84217b08d030320d6edeb29e713805e8620b896"} Apr 16 16:22:54.664036 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:54.664002 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-165.ec2.internal" event={"ID":"cfff45f070cd3f24f31d63385bd46a42","Type":"ContainerStarted","Data":"3acaf074b9bd2874a51919f6eec431b599248965b03b24be1b204bac796cda7b"} Apr 16 16:22:54.665469 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:54.665430 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hschh" 
event={"ID":"652350aa-d2fc-4c32-bc1b-e593db927908","Type":"ContainerStarted","Data":"1e5d59067419650ab0fc743ba87c99893c47723537291fc4b33c4370d7f6e63d"} Apr 16 16:22:54.666697 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:54.666675 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rd84q" event={"ID":"753dfb74-b65d-4c0b-b6d1-a0907d0024bc","Type":"ContainerStarted","Data":"6878e6623e07472b894644c0178d92d021da1e0b9555b3db89eba06f6f26d77f"} Apr 16 16:22:54.668635 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:54.668610 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b587s" event={"ID":"71da194f-358e-449e-9a55-2882465c41ef","Type":"ContainerStarted","Data":"7cf89c1b1676ee2fadd38eb9022d1047a34ac94c53dde53dc8da49a471bf7f5c"} Apr 16 16:22:54.671848 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:54.671822 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9ngr" event={"ID":"c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64","Type":"ContainerStarted","Data":"d3fbae8588cb7d65cf51954090edee5266fe5e471e5f31ac140a0876e2172d7a"} Apr 16 16:22:54.674201 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:54.674150 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p6shp" event={"ID":"3c6f4643-0f15-43f3-b51e-e048015bf431","Type":"ContainerStarted","Data":"7b88bf21fb25dbbf57a8379623adec716c889fe54ba5909fc1e645d4bdb19dd0"} Apr 16 16:22:54.680722 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:54.680569 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-165.ec2.internal" podStartSLOduration=2.6805568920000002 podStartE2EDuration="2.680556892s" podCreationTimestamp="2026-04-16 16:22:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 
16:22:54.679803754 +0000 UTC m=+3.644584261" watchObservedRunningTime="2026-04-16 16:22:54.680556892 +0000 UTC m=+3.645337388" Apr 16 16:22:55.133398 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:55.133365 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/858151a3-bcef-4b9a-94c3-32bd1f0db177-metrics-certs\") pod \"network-metrics-daemon-sdrp4\" (UID: \"858151a3-bcef-4b9a-94c3-32bd1f0db177\") " pod="openshift-multus/network-metrics-daemon-sdrp4" Apr 16 16:22:55.133685 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:55.133551 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:22:55.133685 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:55.133626 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/858151a3-bcef-4b9a-94c3-32bd1f0db177-metrics-certs podName:858151a3-bcef-4b9a-94c3-32bd1f0db177 nodeName:}" failed. No retries permitted until 2026-04-16 16:22:57.133604706 +0000 UTC m=+6.098385195 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/858151a3-bcef-4b9a-94c3-32bd1f0db177-metrics-certs") pod "network-metrics-daemon-sdrp4" (UID: "858151a3-bcef-4b9a-94c3-32bd1f0db177") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:22:55.234681 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:55.234594 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clhqw\" (UniqueName: \"kubernetes.io/projected/d0d1cd03-838f-49df-b77e-5eb6e1a96deb-kube-api-access-clhqw\") pod \"network-check-target-jpkws\" (UID: \"d0d1cd03-838f-49df-b77e-5eb6e1a96deb\") " pod="openshift-network-diagnostics/network-check-target-jpkws" Apr 16 16:22:55.234852 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:55.234755 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:22:55.234852 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:55.234773 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:22:55.234852 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:55.234786 2577 projected.go:194] Error preparing data for projected volume kube-api-access-clhqw for pod openshift-network-diagnostics/network-check-target-jpkws: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:22:55.235001 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:55.234859 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d0d1cd03-838f-49df-b77e-5eb6e1a96deb-kube-api-access-clhqw podName:d0d1cd03-838f-49df-b77e-5eb6e1a96deb nodeName:}" failed. 
No retries permitted until 2026-04-16 16:22:57.234834378 +0000 UTC m=+6.199614859 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-clhqw" (UniqueName: "kubernetes.io/projected/d0d1cd03-838f-49df-b77e-5eb6e1a96deb-kube-api-access-clhqw") pod "network-check-target-jpkws" (UID: "d0d1cd03-838f-49df-b77e-5eb6e1a96deb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:22:55.647992 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:55.647202 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sdrp4" Apr 16 16:22:55.647992 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:55.647351 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sdrp4" podUID="858151a3-bcef-4b9a-94c3-32bd1f0db177" Apr 16 16:22:55.647992 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:55.647835 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jpkws" Apr 16 16:22:55.647992 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:55.647932 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jpkws" podUID="d0d1cd03-838f-49df-b77e-5eb6e1a96deb" Apr 16 16:22:55.688435 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:55.688397 2577 generic.go:358] "Generic (PLEG): container finished" podID="bccc9b33d7859fe9c2c31fd9465d1b33" containerID="4032ced643bd697e526bd941bb8f780a91852eb4d732ca1a0fd2a0536fb46faa" exitCode=0 Apr 16 16:22:55.688624 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:55.688547 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-165.ec2.internal" event={"ID":"bccc9b33d7859fe9c2c31fd9465d1b33","Type":"ContainerDied","Data":"4032ced643bd697e526bd941bb8f780a91852eb4d732ca1a0fd2a0536fb46faa"} Apr 16 16:22:56.707365 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:56.707295 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-165.ec2.internal" event={"ID":"bccc9b33d7859fe9c2c31fd9465d1b33","Type":"ContainerStarted","Data":"3a354af7f6ad1ca6fb88b72012037a0a02861c7f7de4ff4f6b8a5b88bc8c5ac7"} Apr 16 16:22:57.149011 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:57.148316 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/858151a3-bcef-4b9a-94c3-32bd1f0db177-metrics-certs\") pod \"network-metrics-daemon-sdrp4\" (UID: \"858151a3-bcef-4b9a-94c3-32bd1f0db177\") " pod="openshift-multus/network-metrics-daemon-sdrp4" Apr 16 16:22:57.149011 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:57.148521 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:22:57.149011 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:57.148589 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/858151a3-bcef-4b9a-94c3-32bd1f0db177-metrics-certs 
podName:858151a3-bcef-4b9a-94c3-32bd1f0db177 nodeName:}" failed. No retries permitted until 2026-04-16 16:23:01.148569628 +0000 UTC m=+10.113350113 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/858151a3-bcef-4b9a-94c3-32bd1f0db177-metrics-certs") pod "network-metrics-daemon-sdrp4" (UID: "858151a3-bcef-4b9a-94c3-32bd1f0db177") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:22:57.249736 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:57.249696 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clhqw\" (UniqueName: \"kubernetes.io/projected/d0d1cd03-838f-49df-b77e-5eb6e1a96deb-kube-api-access-clhqw\") pod \"network-check-target-jpkws\" (UID: \"d0d1cd03-838f-49df-b77e-5eb6e1a96deb\") " pod="openshift-network-diagnostics/network-check-target-jpkws" Apr 16 16:22:57.249924 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:57.249902 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:22:57.250002 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:57.249932 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:22:57.250002 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:57.249945 2577 projected.go:194] Error preparing data for projected volume kube-api-access-clhqw for pod openshift-network-diagnostics/network-check-target-jpkws: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:22:57.250101 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:57.250013 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/d0d1cd03-838f-49df-b77e-5eb6e1a96deb-kube-api-access-clhqw podName:d0d1cd03-838f-49df-b77e-5eb6e1a96deb nodeName:}" failed. No retries permitted until 2026-04-16 16:23:01.249993733 +0000 UTC m=+10.214774214 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-clhqw" (UniqueName: "kubernetes.io/projected/d0d1cd03-838f-49df-b77e-5eb6e1a96deb-kube-api-access-clhqw") pod "network-check-target-jpkws" (UID: "d0d1cd03-838f-49df-b77e-5eb6e1a96deb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:22:57.647042 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:57.647006 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jpkws" Apr 16 16:22:57.647210 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:57.647144 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jpkws" podUID="d0d1cd03-838f-49df-b77e-5eb6e1a96deb" Apr 16 16:22:57.649537 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:57.649508 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sdrp4" Apr 16 16:22:57.649660 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:57.649639 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sdrp4" podUID="858151a3-bcef-4b9a-94c3-32bd1f0db177" Apr 16 16:22:59.646617 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:59.646577 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jpkws" Apr 16 16:22:59.647057 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:59.646705 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jpkws" podUID="d0d1cd03-838f-49df-b77e-5eb6e1a96deb" Apr 16 16:22:59.647125 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:22:59.647105 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sdrp4" Apr 16 16:22:59.647233 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:22:59.647210 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sdrp4" podUID="858151a3-bcef-4b9a-94c3-32bd1f0db177" Apr 16 16:23:01.184232 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:01.184190 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/858151a3-bcef-4b9a-94c3-32bd1f0db177-metrics-certs\") pod \"network-metrics-daemon-sdrp4\" (UID: \"858151a3-bcef-4b9a-94c3-32bd1f0db177\") " pod="openshift-multus/network-metrics-daemon-sdrp4" Apr 16 16:23:01.184736 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:01.184381 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:23:01.184736 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:01.184483 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/858151a3-bcef-4b9a-94c3-32bd1f0db177-metrics-certs podName:858151a3-bcef-4b9a-94c3-32bd1f0db177 nodeName:}" failed. No retries permitted until 2026-04-16 16:23:09.184460739 +0000 UTC m=+18.149241225 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/858151a3-bcef-4b9a-94c3-32bd1f0db177-metrics-certs") pod "network-metrics-daemon-sdrp4" (UID: "858151a3-bcef-4b9a-94c3-32bd1f0db177") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:23:01.284785 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:01.284745 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clhqw\" (UniqueName: \"kubernetes.io/projected/d0d1cd03-838f-49df-b77e-5eb6e1a96deb-kube-api-access-clhqw\") pod \"network-check-target-jpkws\" (UID: \"d0d1cd03-838f-49df-b77e-5eb6e1a96deb\") " pod="openshift-network-diagnostics/network-check-target-jpkws" Apr 16 16:23:01.284984 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:01.284960 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:23:01.285055 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:01.284991 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:23:01.285055 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:01.285005 2577 projected.go:194] Error preparing data for projected volume kube-api-access-clhqw for pod openshift-network-diagnostics/network-check-target-jpkws: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:23:01.285155 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:01.285065 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d0d1cd03-838f-49df-b77e-5eb6e1a96deb-kube-api-access-clhqw podName:d0d1cd03-838f-49df-b77e-5eb6e1a96deb nodeName:}" failed. 
No retries permitted until 2026-04-16 16:23:09.285045644 +0000 UTC m=+18.249826125 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-clhqw" (UniqueName: "kubernetes.io/projected/d0d1cd03-838f-49df-b77e-5eb6e1a96deb-kube-api-access-clhqw") pod "network-check-target-jpkws" (UID: "d0d1cd03-838f-49df-b77e-5eb6e1a96deb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:23:01.647906 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:01.647803 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jpkws" Apr 16 16:23:01.648096 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:01.647937 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jpkws" podUID="d0d1cd03-838f-49df-b77e-5eb6e1a96deb" Apr 16 16:23:01.648096 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:01.647978 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sdrp4" Apr 16 16:23:01.648096 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:01.648048 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sdrp4" podUID="858151a3-bcef-4b9a-94c3-32bd1f0db177" Apr 16 16:23:03.646526 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:03.646485 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jpkws" Apr 16 16:23:03.646995 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:03.646531 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sdrp4" Apr 16 16:23:03.646995 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:03.646623 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jpkws" podUID="d0d1cd03-838f-49df-b77e-5eb6e1a96deb" Apr 16 16:23:03.646995 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:03.646743 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sdrp4" podUID="858151a3-bcef-4b9a-94c3-32bd1f0db177" Apr 16 16:23:05.647152 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:05.647111 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sdrp4" Apr 16 16:23:05.647613 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:05.647111 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jpkws" Apr 16 16:23:05.647613 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:05.647279 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sdrp4" podUID="858151a3-bcef-4b9a-94c3-32bd1f0db177" Apr 16 16:23:05.647613 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:05.647340 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jpkws" podUID="d0d1cd03-838f-49df-b77e-5eb6e1a96deb" Apr 16 16:23:07.647196 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:07.647105 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sdrp4" Apr 16 16:23:07.647668 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:07.647262 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sdrp4" podUID="858151a3-bcef-4b9a-94c3-32bd1f0db177" Apr 16 16:23:07.647668 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:07.647353 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jpkws" Apr 16 16:23:07.647668 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:07.647476 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jpkws" podUID="d0d1cd03-838f-49df-b77e-5eb6e1a96deb" Apr 16 16:23:09.243719 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:09.243662 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/858151a3-bcef-4b9a-94c3-32bd1f0db177-metrics-certs\") pod \"network-metrics-daemon-sdrp4\" (UID: \"858151a3-bcef-4b9a-94c3-32bd1f0db177\") " pod="openshift-multus/network-metrics-daemon-sdrp4" Apr 16 16:23:09.244220 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:09.243852 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:23:09.244220 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:09.243944 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/858151a3-bcef-4b9a-94c3-32bd1f0db177-metrics-certs podName:858151a3-bcef-4b9a-94c3-32bd1f0db177 nodeName:}" failed. No retries permitted until 2026-04-16 16:23:25.243922212 +0000 UTC m=+34.208702691 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/858151a3-bcef-4b9a-94c3-32bd1f0db177-metrics-certs") pod "network-metrics-daemon-sdrp4" (UID: "858151a3-bcef-4b9a-94c3-32bd1f0db177") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:23:09.345035 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:09.344992 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clhqw\" (UniqueName: \"kubernetes.io/projected/d0d1cd03-838f-49df-b77e-5eb6e1a96deb-kube-api-access-clhqw\") pod \"network-check-target-jpkws\" (UID: \"d0d1cd03-838f-49df-b77e-5eb6e1a96deb\") " pod="openshift-network-diagnostics/network-check-target-jpkws" Apr 16 16:23:09.345218 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:09.345177 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:23:09.345218 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:09.345203 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:23:09.345218 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:09.345213 2577 projected.go:194] Error preparing data for projected volume kube-api-access-clhqw for pod openshift-network-diagnostics/network-check-target-jpkws: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:23:09.345343 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:09.345275 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d0d1cd03-838f-49df-b77e-5eb6e1a96deb-kube-api-access-clhqw podName:d0d1cd03-838f-49df-b77e-5eb6e1a96deb nodeName:}" failed. 
No retries permitted until 2026-04-16 16:23:25.345254906 +0000 UTC m=+34.310035402 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-clhqw" (UniqueName: "kubernetes.io/projected/d0d1cd03-838f-49df-b77e-5eb6e1a96deb-kube-api-access-clhqw") pod "network-check-target-jpkws" (UID: "d0d1cd03-838f-49df-b77e-5eb6e1a96deb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:23:09.647204 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:09.647120 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jpkws" Apr 16 16:23:09.647204 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:09.647159 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sdrp4" Apr 16 16:23:09.647428 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:09.647248 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jpkws" podUID="d0d1cd03-838f-49df-b77e-5eb6e1a96deb" Apr 16 16:23:09.647428 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:09.647390 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sdrp4" podUID="858151a3-bcef-4b9a-94c3-32bd1f0db177" Apr 16 16:23:11.647734 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:11.647693 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jpkws" Apr 16 16:23:11.648309 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:11.647802 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jpkws" podUID="d0d1cd03-838f-49df-b77e-5eb6e1a96deb" Apr 16 16:23:11.648309 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:11.647853 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sdrp4" Apr 16 16:23:11.648309 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:11.647980 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sdrp4" podUID="858151a3-bcef-4b9a-94c3-32bd1f0db177" Apr 16 16:23:12.738674 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:12.736932 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7h5k5" event={"ID":"937105e9-6cc7-458f-9b5c-007250aa5a6c","Type":"ContainerStarted","Data":"ecd062ca8ef5b367da1780f4a64d11a53751cc19850f4e20efb7b62333159290"} Apr 16 16:23:12.741421 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:12.741384 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dcxck" event={"ID":"4ea338ac-fe3a-449a-a4ba-c8d631e6b043","Type":"ContainerStarted","Data":"90c2fd831639ab3aa09eb2fccc1c750fdd42e17e89448cb127b757a5e4150dbc"} Apr 16 16:23:12.742891 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:12.742862 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gq7bg" event={"ID":"b5a35ec4-25f4-4c5b-8175-23e377d3e9b3","Type":"ContainerStarted","Data":"736a714e54bb842a8ff8cfe09a0d480b629b6efed5a8f43c2d30628b04e3f563"} Apr 16 16:23:12.745532 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:12.745510 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hschh_652350aa-d2fc-4c32-bc1b-e593db927908/ovn-acl-logging/0.log" Apr 16 16:23:12.745858 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:12.745834 2577 generic.go:358] "Generic (PLEG): container finished" podID="652350aa-d2fc-4c32-bc1b-e593db927908" containerID="8e076044af919bddeb489a1aea0fd253f2cfe09490f1540e400bf9b048b5c7b4" exitCode=1 Apr 16 16:23:12.745935 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:12.745899 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hschh" event={"ID":"652350aa-d2fc-4c32-bc1b-e593db927908","Type":"ContainerStarted","Data":"63bc892fa0f7f921a49d5f629c159252cbdd44bba45704f7c2ab1839fd7cd18d"} Apr 16 16:23:12.745988 
ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:12.745935 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hschh" event={"ID":"652350aa-d2fc-4c32-bc1b-e593db927908","Type":"ContainerStarted","Data":"d4b491945cab85423aed65f62d73800174104a5b852627c697cfbb840a748922"} Apr 16 16:23:12.745988 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:12.745948 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hschh" event={"ID":"652350aa-d2fc-4c32-bc1b-e593db927908","Type":"ContainerStarted","Data":"ac1830057c96dd1966b92b7692818c9ca6f768a0cddc0f219e396c0e0759b8d8"} Apr 16 16:23:12.745988 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:12.745960 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hschh" event={"ID":"652350aa-d2fc-4c32-bc1b-e593db927908","Type":"ContainerStarted","Data":"fb2b058f1a057129ddedde90625e117e2fb77e74b79e9a74c8314b2114ecc164"} Apr 16 16:23:12.745988 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:12.745971 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hschh" event={"ID":"652350aa-d2fc-4c32-bc1b-e593db927908","Type":"ContainerDied","Data":"8e076044af919bddeb489a1aea0fd253f2cfe09490f1540e400bf9b048b5c7b4"} Apr 16 16:23:12.745988 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:12.745987 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hschh" event={"ID":"652350aa-d2fc-4c32-bc1b-e593db927908","Type":"ContainerStarted","Data":"9c3482fad3cd6e135d3d70b14baf020af8a0cf5098a365aab65ea6973f61a0e8"} Apr 16 16:23:12.747280 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:12.747231 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rd84q" 
event={"ID":"753dfb74-b65d-4c0b-b6d1-a0907d0024bc","Type":"ContainerStarted","Data":"f5dffdb09ab4417f9fabc8478ae69a66f7a0cbbafdfbae690f7f0e0db381d4af"}
Apr 16 16:23:12.748629 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:12.748600 2577 generic.go:358] "Generic (PLEG): container finished" podID="71da194f-358e-449e-9a55-2882465c41ef" containerID="0375e020a87ecf776a74358d7fd17705b41b6f57d9248545e198db53b438ba77" exitCode=0
Apr 16 16:23:12.748722 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:12.748649 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b587s" event={"ID":"71da194f-358e-449e-9a55-2882465c41ef","Type":"ContainerDied","Data":"0375e020a87ecf776a74358d7fd17705b41b6f57d9248545e198db53b438ba77"}
Apr 16 16:23:12.750018 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:12.749994 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9ngr" event={"ID":"c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64","Type":"ContainerStarted","Data":"c2ae532ac72b667a2ca2dd8bdf9160f1a3fb0c78dcc61a490e2bcdd7069112dd"}
Apr 16 16:23:12.751479 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:12.751425 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p6shp" event={"ID":"3c6f4643-0f15-43f3-b51e-e048015bf431","Type":"ContainerStarted","Data":"0c8b4cd1592aceff4bf82d8976765950b0dbe0f7fdbf8e5a61db095821639873"}
Apr 16 16:23:12.755029 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:12.754986 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-165.ec2.internal" podStartSLOduration=20.754972542 podStartE2EDuration="20.754972542s" podCreationTimestamp="2026-04-16 16:22:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:22:56.724637857 +0000 UTC m=+5.689418353" watchObservedRunningTime="2026-04-16 16:23:12.754972542 +0000 UTC m=+21.719753037"
Apr 16 16:23:12.755479 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:12.755434 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7h5k5" podStartSLOduration=4.247800289 podStartE2EDuration="21.75542748s" podCreationTimestamp="2026-04-16 16:22:51 +0000 UTC" firstStartedPulling="2026-04-16 16:22:54.266149235 +0000 UTC m=+3.230929716" lastFinishedPulling="2026-04-16 16:23:11.773776431 +0000 UTC m=+20.738556907" observedRunningTime="2026-04-16 16:23:12.754682066 +0000 UTC m=+21.719462565" watchObservedRunningTime="2026-04-16 16:23:12.75542748 +0000 UTC m=+21.720207975"
Apr 16 16:23:12.780809 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:12.780756 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-dcxck" podStartSLOduration=4.267957258 podStartE2EDuration="21.780739964s" podCreationTimestamp="2026-04-16 16:22:51 +0000 UTC" firstStartedPulling="2026-04-16 16:22:54.262628552 +0000 UTC m=+3.227409039" lastFinishedPulling="2026-04-16 16:23:11.775411267 +0000 UTC m=+20.740191745" observedRunningTime="2026-04-16 16:23:12.77976459 +0000 UTC m=+21.744545087" watchObservedRunningTime="2026-04-16 16:23:12.780739964 +0000 UTC m=+21.745520459"
Apr 16 16:23:12.797299 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:12.797197 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-gq7bg" podStartSLOduration=3.284544617 podStartE2EDuration="20.797182761s" podCreationTimestamp="2026-04-16 16:22:52 +0000 UTC" firstStartedPulling="2026-04-16 16:22:54.261038848 +0000 UTC m=+3.225819323" lastFinishedPulling="2026-04-16 16:23:11.773676986 +0000 UTC m=+20.738457467" observedRunningTime="2026-04-16 16:23:12.796299874 +0000 UTC m=+21.761080360" watchObservedRunningTime="2026-04-16 16:23:12.797182761 +0000 UTC m=+21.761963253"
Apr 16 16:23:12.846917 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:12.846854 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-p6shp" podStartSLOduration=4.295213244 podStartE2EDuration="21.846833814s" podCreationTimestamp="2026-04-16 16:22:51 +0000 UTC" firstStartedPulling="2026-04-16 16:22:54.252975511 +0000 UTC m=+3.217755988" lastFinishedPulling="2026-04-16 16:23:11.804596084 +0000 UTC m=+20.769376558" observedRunningTime="2026-04-16 16:23:12.845891047 +0000 UTC m=+21.810671543" watchObservedRunningTime="2026-04-16 16:23:12.846833814 +0000 UTC m=+21.811614310"
Apr 16 16:23:12.867049 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:12.866990 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-rd84q" podStartSLOduration=4.35703009 podStartE2EDuration="21.866970628s" podCreationTimestamp="2026-04-16 16:22:51 +0000 UTC" firstStartedPulling="2026-04-16 16:22:54.263826099 +0000 UTC m=+3.228606590" lastFinishedPulling="2026-04-16 16:23:11.77376664 +0000 UTC m=+20.738547128" observedRunningTime="2026-04-16 16:23:12.866615216 +0000 UTC m=+21.831395711" watchObservedRunningTime="2026-04-16 16:23:12.866970628 +0000 UTC m=+21.831751125"
Apr 16 16:23:13.070456 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:13.070416 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 16:23:13.578670 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:13.578467 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T16:23:13.070433885Z","UUID":"b1feb284-3d6f-407c-bf65-5b7e0e4e2dfc","Handler":null,"Name":"","Endpoint":""}
Apr 16 16:23:13.580414 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:13.580387 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 16:23:13.580414 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:13.580420 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 16:23:13.647007 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:13.646974 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sdrp4"
Apr 16 16:23:13.647184 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:13.646985 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jpkws"
Apr 16 16:23:13.647184 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:13.647128 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sdrp4" podUID="858151a3-bcef-4b9a-94c3-32bd1f0db177"
Apr 16 16:23:13.647281 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:13.647183 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jpkws" podUID="d0d1cd03-838f-49df-b77e-5eb6e1a96deb"
Apr 16 16:23:13.755196 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:13.755146 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9ngr" event={"ID":"c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64","Type":"ContainerStarted","Data":"9afba53ab90ca1883394bb949df5b2f474d4aa460134578c829060e1dd6dff02"}
Apr 16 16:23:13.756694 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:13.756661 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2tqg9" event={"ID":"994d74ed-a014-4bb9-9549-70f76b64ca30","Type":"ContainerStarted","Data":"f5af177e9b0d831d1fbb844a8af1372e57e6517292ede95a52f42f4e726f31ce"}
Apr 16 16:23:13.774152 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:13.774088 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-2tqg9" podStartSLOduration=4.262090964 podStartE2EDuration="21.774071705s" podCreationTimestamp="2026-04-16 16:22:52 +0000 UTC" firstStartedPulling="2026-04-16 16:22:54.261874185 +0000 UTC m=+3.226654674" lastFinishedPulling="2026-04-16 16:23:11.773854927 +0000 UTC m=+20.738635415" observedRunningTime="2026-04-16 16:23:13.773612603 +0000 UTC m=+22.738393100" watchObservedRunningTime="2026-04-16 16:23:13.774071705 +0000 UTC m=+22.738852201"
Apr 16 16:23:14.760388 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:14.760348 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9ngr" event={"ID":"c22ec3e6-b1ed-430d-be1f-d7a2c8fe1a64","Type":"ContainerStarted","Data":"bf196048e1d73958a8130d656d6b4b0dfb6850225bd272b5202cea202827f159"}
Apr 16 16:23:14.763295 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:14.763273 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hschh_652350aa-d2fc-4c32-bc1b-e593db927908/ovn-acl-logging/0.log"
Apr 16 16:23:14.763661 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:14.763635 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hschh" event={"ID":"652350aa-d2fc-4c32-bc1b-e593db927908","Type":"ContainerStarted","Data":"89cd950298a551d518d8b2557eca7cf925d43d5fd0842fc491084c1dd8f0573f"}
Apr 16 16:23:14.780187 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:14.780136 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9ngr" podStartSLOduration=4.024650731 podStartE2EDuration="23.7801212s" podCreationTimestamp="2026-04-16 16:22:51 +0000 UTC" firstStartedPulling="2026-04-16 16:22:54.254198695 +0000 UTC m=+3.218979169" lastFinishedPulling="2026-04-16 16:23:14.009669149 +0000 UTC m=+22.974449638" observedRunningTime="2026-04-16 16:23:14.779282387 +0000 UTC m=+23.744062900" watchObservedRunningTime="2026-04-16 16:23:14.7801212 +0000 UTC m=+23.744901691"
Apr 16 16:23:15.647062 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:15.646830 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sdrp4"
Apr 16 16:23:15.647238 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:15.646936 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jpkws"
Apr 16 16:23:15.647238 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:15.647187 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sdrp4" podUID="858151a3-bcef-4b9a-94c3-32bd1f0db177"
Apr 16 16:23:15.647337 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:15.647309 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jpkws" podUID="d0d1cd03-838f-49df-b77e-5eb6e1a96deb"
Apr 16 16:23:16.871683 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:16.871648 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-rd84q"
Apr 16 16:23:16.872749 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:16.872730 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-rd84q"
Apr 16 16:23:17.042750 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:17.042513 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-zh4s9"]
Apr 16 16:23:17.044424 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:17.044405 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zh4s9"
Apr 16 16:23:17.044531 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:17.044513 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zh4s9" podUID="23450ee2-cf03-4966-b11a-bec44507a72d"
Apr 16 16:23:17.108526 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:17.108475 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/23450ee2-cf03-4966-b11a-bec44507a72d-kubelet-config\") pod \"global-pull-secret-syncer-zh4s9\" (UID: \"23450ee2-cf03-4966-b11a-bec44507a72d\") " pod="kube-system/global-pull-secret-syncer-zh4s9"
Apr 16 16:23:17.108526 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:17.108527 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/23450ee2-cf03-4966-b11a-bec44507a72d-dbus\") pod \"global-pull-secret-syncer-zh4s9\" (UID: \"23450ee2-cf03-4966-b11a-bec44507a72d\") " pod="kube-system/global-pull-secret-syncer-zh4s9"
Apr 16 16:23:17.108693 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:17.108614 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/23450ee2-cf03-4966-b11a-bec44507a72d-original-pull-secret\") pod \"global-pull-secret-syncer-zh4s9\" (UID: \"23450ee2-cf03-4966-b11a-bec44507a72d\") " pod="kube-system/global-pull-secret-syncer-zh4s9"
Apr 16 16:23:17.209054 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:17.209011 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/23450ee2-cf03-4966-b11a-bec44507a72d-original-pull-secret\") pod \"global-pull-secret-syncer-zh4s9\" (UID: \"23450ee2-cf03-4966-b11a-bec44507a72d\") " pod="kube-system/global-pull-secret-syncer-zh4s9"
Apr 16 16:23:17.209204 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:17.209073 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/23450ee2-cf03-4966-b11a-bec44507a72d-kubelet-config\") pod \"global-pull-secret-syncer-zh4s9\" (UID: \"23450ee2-cf03-4966-b11a-bec44507a72d\") " pod="kube-system/global-pull-secret-syncer-zh4s9"
Apr 16 16:23:17.209204 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:17.209097 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/23450ee2-cf03-4966-b11a-bec44507a72d-dbus\") pod \"global-pull-secret-syncer-zh4s9\" (UID: \"23450ee2-cf03-4966-b11a-bec44507a72d\") " pod="kube-system/global-pull-secret-syncer-zh4s9"
Apr 16 16:23:17.209204 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:17.209174 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 16:23:17.209321 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:17.209210 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/23450ee2-cf03-4966-b11a-bec44507a72d-kubelet-config\") pod \"global-pull-secret-syncer-zh4s9\" (UID: \"23450ee2-cf03-4966-b11a-bec44507a72d\") " pod="kube-system/global-pull-secret-syncer-zh4s9"
Apr 16 16:23:17.209321 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:17.209242 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23450ee2-cf03-4966-b11a-bec44507a72d-original-pull-secret podName:23450ee2-cf03-4966-b11a-bec44507a72d nodeName:}" failed. No retries permitted until 2026-04-16 16:23:17.70922712 +0000 UTC m=+26.674007599 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/23450ee2-cf03-4966-b11a-bec44507a72d-original-pull-secret") pod "global-pull-secret-syncer-zh4s9" (UID: "23450ee2-cf03-4966-b11a-bec44507a72d") : object "kube-system"/"original-pull-secret" not registered
Apr 16 16:23:17.209321 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:17.209299 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/23450ee2-cf03-4966-b11a-bec44507a72d-dbus\") pod \"global-pull-secret-syncer-zh4s9\" (UID: \"23450ee2-cf03-4966-b11a-bec44507a72d\") " pod="kube-system/global-pull-secret-syncer-zh4s9"
Apr 16 16:23:17.646368 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:17.646334 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jpkws"
Apr 16 16:23:17.646368 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:17.646352 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sdrp4"
Apr 16 16:23:17.646593 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:17.646465 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jpkws" podUID="d0d1cd03-838f-49df-b77e-5eb6e1a96deb"
Apr 16 16:23:17.646640 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:17.646596 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sdrp4" podUID="858151a3-bcef-4b9a-94c3-32bd1f0db177"
Apr 16 16:23:17.712022 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:17.711983 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/23450ee2-cf03-4966-b11a-bec44507a72d-original-pull-secret\") pod \"global-pull-secret-syncer-zh4s9\" (UID: \"23450ee2-cf03-4966-b11a-bec44507a72d\") " pod="kube-system/global-pull-secret-syncer-zh4s9"
Apr 16 16:23:17.712193 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:17.712147 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 16:23:17.712230 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:17.712212 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23450ee2-cf03-4966-b11a-bec44507a72d-original-pull-secret podName:23450ee2-cf03-4966-b11a-bec44507a72d nodeName:}" failed. No retries permitted until 2026-04-16 16:23:18.71219803 +0000 UTC m=+27.676978504 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/23450ee2-cf03-4966-b11a-bec44507a72d-original-pull-secret") pod "global-pull-secret-syncer-zh4s9" (UID: "23450ee2-cf03-4966-b11a-bec44507a72d") : object "kube-system"/"original-pull-secret" not registered
Apr 16 16:23:17.773495 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:17.773468 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hschh_652350aa-d2fc-4c32-bc1b-e593db927908/ovn-acl-logging/0.log"
Apr 16 16:23:17.773840 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:17.773811 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hschh" event={"ID":"652350aa-d2fc-4c32-bc1b-e593db927908","Type":"ContainerStarted","Data":"bea5eee868e9772a902443b84dd4755e822582a2fd3988bb7a73cacd6825b66c"}
Apr 16 16:23:17.774122 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:17.774098 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hschh"
Apr 16 16:23:17.774122 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:17.774123 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hschh"
Apr 16 16:23:17.774482 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:17.774462 2577 scope.go:117] "RemoveContainer" containerID="8e076044af919bddeb489a1aea0fd253f2cfe09490f1540e400bf9b048b5c7b4"
Apr 16 16:23:17.775555 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:17.775532 2577 generic.go:358] "Generic (PLEG): container finished" podID="71da194f-358e-449e-9a55-2882465c41ef" containerID="1fe273424bceedb0887a410c77b3cf70c9032b518e0a35ed1e8f6e97da4955ec" exitCode=0
Apr 16 16:23:17.775641 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:17.775618 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b587s" event={"ID":"71da194f-358e-449e-9a55-2882465c41ef","Type":"ContainerDied","Data":"1fe273424bceedb0887a410c77b3cf70c9032b518e0a35ed1e8f6e97da4955ec"}
Apr 16 16:23:17.775835 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:17.775814 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-rd84q"
Apr 16 16:23:17.776364 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:17.776309 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-rd84q"
Apr 16 16:23:17.790371 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:17.790273 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hschh"
Apr 16 16:23:18.647073 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:18.647036 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zh4s9"
Apr 16 16:23:18.647516 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:18.647148 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zh4s9" podUID="23450ee2-cf03-4966-b11a-bec44507a72d"
Apr 16 16:23:18.690996 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:18.690967 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sdrp4"]
Apr 16 16:23:18.691132 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:18.691073 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sdrp4"
Apr 16 16:23:18.691201 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:18.691179 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sdrp4" podUID="858151a3-bcef-4b9a-94c3-32bd1f0db177"
Apr 16 16:23:18.691615 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:18.691588 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jpkws"]
Apr 16 16:23:18.691732 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:18.691683 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jpkws"
Apr 16 16:23:18.691794 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:18.691760 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jpkws" podUID="d0d1cd03-838f-49df-b77e-5eb6e1a96deb"
Apr 16 16:23:18.695857 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:18.695830 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-zh4s9"]
Apr 16 16:23:18.719752 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:18.719704 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/23450ee2-cf03-4966-b11a-bec44507a72d-original-pull-secret\") pod \"global-pull-secret-syncer-zh4s9\" (UID: \"23450ee2-cf03-4966-b11a-bec44507a72d\") " pod="kube-system/global-pull-secret-syncer-zh4s9"
Apr 16 16:23:18.719911 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:18.719855 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 16:23:18.719967 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:18.719919 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23450ee2-cf03-4966-b11a-bec44507a72d-original-pull-secret podName:23450ee2-cf03-4966-b11a-bec44507a72d nodeName:}" failed. No retries permitted until 2026-04-16 16:23:20.719900353 +0000 UTC m=+29.684680840 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/23450ee2-cf03-4966-b11a-bec44507a72d-original-pull-secret") pod "global-pull-secret-syncer-zh4s9" (UID: "23450ee2-cf03-4966-b11a-bec44507a72d") : object "kube-system"/"original-pull-secret" not registered
Apr 16 16:23:18.783216 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:18.783188 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hschh_652350aa-d2fc-4c32-bc1b-e593db927908/ovn-acl-logging/0.log"
Apr 16 16:23:18.783651 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:18.783625 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zh4s9"
Apr 16 16:23:18.783763 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:18.783625 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hschh" event={"ID":"652350aa-d2fc-4c32-bc1b-e593db927908","Type":"ContainerStarted","Data":"e6a0c440532698c45d4a82e9862e092202bfc813c6bc092be4f2b0654e0d515e"}
Apr 16 16:23:18.783763 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:18.783741 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zh4s9" podUID="23450ee2-cf03-4966-b11a-bec44507a72d"
Apr 16 16:23:18.784028 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:18.784006 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hschh"
Apr 16 16:23:18.799153 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:18.799127 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hschh"
Apr 16 16:23:18.816469 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:18.816386 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hschh" podStartSLOduration=10.162471453 podStartE2EDuration="27.816366324s" podCreationTimestamp="2026-04-16 16:22:51 +0000 UTC" firstStartedPulling="2026-04-16 16:22:54.264856517 +0000 UTC m=+3.229636996" lastFinishedPulling="2026-04-16 16:23:11.918751389 +0000 UTC m=+20.883531867" observedRunningTime="2026-04-16 16:23:18.814870449 +0000 UTC m=+27.779650945" watchObservedRunningTime="2026-04-16 16:23:18.816366324 +0000 UTC m=+27.781146821"
Apr 16 16:23:19.787462 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:19.787417 2577 generic.go:358] "Generic (PLEG): container finished" podID="71da194f-358e-449e-9a55-2882465c41ef" containerID="1d6eca03dbcbabb60a1dfd61a4ea74556833f6f921cc3dfebafd05f5f86165ef" exitCode=0
Apr 16 16:23:19.787848 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:19.787483 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b587s" event={"ID":"71da194f-358e-449e-9a55-2882465c41ef","Type":"ContainerDied","Data":"1d6eca03dbcbabb60a1dfd61a4ea74556833f6f921cc3dfebafd05f5f86165ef"}
Apr 16 16:23:20.646927 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:20.646892 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zh4s9"
Apr 16 16:23:20.646927 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:20.646927 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sdrp4"
Apr 16 16:23:20.647134 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:20.646906 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jpkws"
Apr 16 16:23:20.647134 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:20.647023 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zh4s9" podUID="23450ee2-cf03-4966-b11a-bec44507a72d"
Apr 16 16:23:20.647134 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:20.647104 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jpkws" podUID="d0d1cd03-838f-49df-b77e-5eb6e1a96deb"
Apr 16 16:23:20.647236 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:20.647198 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sdrp4" podUID="858151a3-bcef-4b9a-94c3-32bd1f0db177"
Apr 16 16:23:20.733665 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:20.733626 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/23450ee2-cf03-4966-b11a-bec44507a72d-original-pull-secret\") pod \"global-pull-secret-syncer-zh4s9\" (UID: \"23450ee2-cf03-4966-b11a-bec44507a72d\") " pod="kube-system/global-pull-secret-syncer-zh4s9"
Apr 16 16:23:20.733813 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:20.733778 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 16:23:20.733872 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:20.733847 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23450ee2-cf03-4966-b11a-bec44507a72d-original-pull-secret podName:23450ee2-cf03-4966-b11a-bec44507a72d nodeName:}" failed. No retries permitted until 2026-04-16 16:23:24.733832354 +0000 UTC m=+33.698612833 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/23450ee2-cf03-4966-b11a-bec44507a72d-original-pull-secret") pod "global-pull-secret-syncer-zh4s9" (UID: "23450ee2-cf03-4966-b11a-bec44507a72d") : object "kube-system"/"original-pull-secret" not registered
Apr 16 16:23:20.794078 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:20.794028 2577 generic.go:358] "Generic (PLEG): container finished" podID="71da194f-358e-449e-9a55-2882465c41ef" containerID="a3b6a225049318b3fa4b961bb75db6a310ad0d2ac2e31a1533b4690c17803951" exitCode=0
Apr 16 16:23:20.794515 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:20.794112 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b587s" event={"ID":"71da194f-358e-449e-9a55-2882465c41ef","Type":"ContainerDied","Data":"a3b6a225049318b3fa4b961bb75db6a310ad0d2ac2e31a1533b4690c17803951"}
Apr 16 16:23:22.647317 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:22.647117 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jpkws"
Apr 16 16:23:22.647826 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:22.647117 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sdrp4"
Apr 16 16:23:22.647826 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:22.647417 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jpkws" podUID="d0d1cd03-838f-49df-b77e-5eb6e1a96deb"
Apr 16 16:23:22.647826 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:22.647518 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sdrp4" podUID="858151a3-bcef-4b9a-94c3-32bd1f0db177"
Apr 16 16:23:22.647826 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:22.647117 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zh4s9"
Apr 16 16:23:22.647826 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:22.647646 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zh4s9" podUID="23450ee2-cf03-4966-b11a-bec44507a72d"
Apr 16 16:23:24.646911 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:24.646872 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zh4s9"
Apr 16 16:23:24.647504 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:24.646997 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zh4s9" podUID="23450ee2-cf03-4966-b11a-bec44507a72d"
Apr 16 16:23:24.647504 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:24.646872 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jpkws"
Apr 16 16:23:24.647504 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:24.646872 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sdrp4"
Apr 16 16:23:24.647504 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:24.647141 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sdrp4" podUID="858151a3-bcef-4b9a-94c3-32bd1f0db177"
Apr 16 16:23:24.647504 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:24.647237 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-jpkws" podUID="d0d1cd03-838f-49df-b77e-5eb6e1a96deb" Apr 16 16:23:24.762280 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:24.762194 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/23450ee2-cf03-4966-b11a-bec44507a72d-original-pull-secret\") pod \"global-pull-secret-syncer-zh4s9\" (UID: \"23450ee2-cf03-4966-b11a-bec44507a72d\") " pod="kube-system/global-pull-secret-syncer-zh4s9" Apr 16 16:23:24.762425 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:24.762350 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:23:24.762425 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:24.762408 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23450ee2-cf03-4966-b11a-bec44507a72d-original-pull-secret podName:23450ee2-cf03-4966-b11a-bec44507a72d nodeName:}" failed. No retries permitted until 2026-04-16 16:23:32.762391708 +0000 UTC m=+41.727172182 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/23450ee2-cf03-4966-b11a-bec44507a72d-original-pull-secret") pod "global-pull-secret-syncer-zh4s9" (UID: "23450ee2-cf03-4966-b11a-bec44507a72d") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:23:24.817697 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:24.817668 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-165.ec2.internal" event="NodeReady" Apr 16 16:23:24.817862 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:24.817796 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 16:23:24.873084 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:24.873054 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-pfv5k"] Apr 16 16:23:24.907136 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:24.907107 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-z5t69"] Apr 16 16:23:24.907308 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:24.907288 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-pfv5k" Apr 16 16:23:24.909812 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:24.909790 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 16:23:24.909970 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:24.909954 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 16:23:24.910073 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:24.909953 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2bd8s\"" Apr 16 16:23:24.922981 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:24.922959 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pfv5k"] Apr 16 16:23:24.923072 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:24.922989 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-z5t69"] Apr 16 16:23:24.923125 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:24.923072 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-z5t69" Apr 16 16:23:24.925531 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:24.925329 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 16:23:24.925531 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:24.925412 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 16:23:24.925531 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:24.925439 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hrwds\"" Apr 16 16:23:24.925531 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:24.925467 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 16:23:25.064290 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:25.064201 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8016a568-6fe9-4dfc-a543-f50b2768e5b2-config-volume\") pod \"dns-default-pfv5k\" (UID: \"8016a568-6fe9-4dfc-a543-f50b2768e5b2\") " pod="openshift-dns/dns-default-pfv5k" Apr 16 16:23:25.064290 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:25.064244 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzxkv\" (UniqueName: \"kubernetes.io/projected/8016a568-6fe9-4dfc-a543-f50b2768e5b2-kube-api-access-dzxkv\") pod \"dns-default-pfv5k\" (UID: \"8016a568-6fe9-4dfc-a543-f50b2768e5b2\") " pod="openshift-dns/dns-default-pfv5k" Apr 16 16:23:25.064290 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:25.064274 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/461b689e-a41b-4182-ba52-e26a1dfbc007-cert\") pod \"ingress-canary-z5t69\" (UID: \"461b689e-a41b-4182-ba52-e26a1dfbc007\") " pod="openshift-ingress-canary/ingress-canary-z5t69" Apr 16 16:23:25.064585 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:25.064323 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8016a568-6fe9-4dfc-a543-f50b2768e5b2-tmp-dir\") pod \"dns-default-pfv5k\" (UID: \"8016a568-6fe9-4dfc-a543-f50b2768e5b2\") " pod="openshift-dns/dns-default-pfv5k" Apr 16 16:23:25.064585 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:25.064363 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8016a568-6fe9-4dfc-a543-f50b2768e5b2-metrics-tls\") pod \"dns-default-pfv5k\" (UID: \"8016a568-6fe9-4dfc-a543-f50b2768e5b2\") " pod="openshift-dns/dns-default-pfv5k" Apr 16 16:23:25.064585 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:25.064463 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9lv9\" (UniqueName: \"kubernetes.io/projected/461b689e-a41b-4182-ba52-e26a1dfbc007-kube-api-access-f9lv9\") pod \"ingress-canary-z5t69\" (UID: \"461b689e-a41b-4182-ba52-e26a1dfbc007\") " pod="openshift-ingress-canary/ingress-canary-z5t69" Apr 16 16:23:25.165486 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:25.165424 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8016a568-6fe9-4dfc-a543-f50b2768e5b2-metrics-tls\") pod \"dns-default-pfv5k\" (UID: \"8016a568-6fe9-4dfc-a543-f50b2768e5b2\") " pod="openshift-dns/dns-default-pfv5k" Apr 16 16:23:25.165669 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:25.165501 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-f9lv9\" (UniqueName: \"kubernetes.io/projected/461b689e-a41b-4182-ba52-e26a1dfbc007-kube-api-access-f9lv9\") pod \"ingress-canary-z5t69\" (UID: \"461b689e-a41b-4182-ba52-e26a1dfbc007\") " pod="openshift-ingress-canary/ingress-canary-z5t69" Apr 16 16:23:25.165669 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:25.165561 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8016a568-6fe9-4dfc-a543-f50b2768e5b2-config-volume\") pod \"dns-default-pfv5k\" (UID: \"8016a568-6fe9-4dfc-a543-f50b2768e5b2\") " pod="openshift-dns/dns-default-pfv5k" Apr 16 16:23:25.165669 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:25.165580 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzxkv\" (UniqueName: \"kubernetes.io/projected/8016a568-6fe9-4dfc-a543-f50b2768e5b2-kube-api-access-dzxkv\") pod \"dns-default-pfv5k\" (UID: \"8016a568-6fe9-4dfc-a543-f50b2768e5b2\") " pod="openshift-dns/dns-default-pfv5k" Apr 16 16:23:25.165669 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:25.165588 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:23:25.165669 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:25.165662 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8016a568-6fe9-4dfc-a543-f50b2768e5b2-metrics-tls podName:8016a568-6fe9-4dfc-a543-f50b2768e5b2 nodeName:}" failed. No retries permitted until 2026-04-16 16:23:25.665642027 +0000 UTC m=+34.630422501 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8016a568-6fe9-4dfc-a543-f50b2768e5b2-metrics-tls") pod "dns-default-pfv5k" (UID: "8016a568-6fe9-4dfc-a543-f50b2768e5b2") : secret "dns-default-metrics-tls" not found Apr 16 16:23:25.165918 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:25.165598 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/461b689e-a41b-4182-ba52-e26a1dfbc007-cert\") pod \"ingress-canary-z5t69\" (UID: \"461b689e-a41b-4182-ba52-e26a1dfbc007\") " pod="openshift-ingress-canary/ingress-canary-z5t69" Apr 16 16:23:25.165918 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:25.165755 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8016a568-6fe9-4dfc-a543-f50b2768e5b2-tmp-dir\") pod \"dns-default-pfv5k\" (UID: \"8016a568-6fe9-4dfc-a543-f50b2768e5b2\") " pod="openshift-dns/dns-default-pfv5k" Apr 16 16:23:25.165918 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:25.165688 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:23:25.165918 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:25.165845 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/461b689e-a41b-4182-ba52-e26a1dfbc007-cert podName:461b689e-a41b-4182-ba52-e26a1dfbc007 nodeName:}" failed. No retries permitted until 2026-04-16 16:23:25.66582705 +0000 UTC m=+34.630607549 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/461b689e-a41b-4182-ba52-e26a1dfbc007-cert") pod "ingress-canary-z5t69" (UID: "461b689e-a41b-4182-ba52-e26a1dfbc007") : secret "canary-serving-cert" not found Apr 16 16:23:25.166123 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:25.166090 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8016a568-6fe9-4dfc-a543-f50b2768e5b2-tmp-dir\") pod \"dns-default-pfv5k\" (UID: \"8016a568-6fe9-4dfc-a543-f50b2768e5b2\") " pod="openshift-dns/dns-default-pfv5k" Apr 16 16:23:25.166191 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:25.166172 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8016a568-6fe9-4dfc-a543-f50b2768e5b2-config-volume\") pod \"dns-default-pfv5k\" (UID: \"8016a568-6fe9-4dfc-a543-f50b2768e5b2\") " pod="openshift-dns/dns-default-pfv5k" Apr 16 16:23:25.177326 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:25.177301 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzxkv\" (UniqueName: \"kubernetes.io/projected/8016a568-6fe9-4dfc-a543-f50b2768e5b2-kube-api-access-dzxkv\") pod \"dns-default-pfv5k\" (UID: \"8016a568-6fe9-4dfc-a543-f50b2768e5b2\") " pod="openshift-dns/dns-default-pfv5k" Apr 16 16:23:25.177487 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:25.177466 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9lv9\" (UniqueName: \"kubernetes.io/projected/461b689e-a41b-4182-ba52-e26a1dfbc007-kube-api-access-f9lv9\") pod \"ingress-canary-z5t69\" (UID: \"461b689e-a41b-4182-ba52-e26a1dfbc007\") " pod="openshift-ingress-canary/ingress-canary-z5t69" Apr 16 16:23:25.266503 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:25.266468 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/858151a3-bcef-4b9a-94c3-32bd1f0db177-metrics-certs\") pod \"network-metrics-daemon-sdrp4\" (UID: \"858151a3-bcef-4b9a-94c3-32bd1f0db177\") " pod="openshift-multus/network-metrics-daemon-sdrp4" Apr 16 16:23:25.266656 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:25.266608 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:23:25.266715 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:25.266684 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/858151a3-bcef-4b9a-94c3-32bd1f0db177-metrics-certs podName:858151a3-bcef-4b9a-94c3-32bd1f0db177 nodeName:}" failed. No retries permitted until 2026-04-16 16:23:57.266663767 +0000 UTC m=+66.231444246 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/858151a3-bcef-4b9a-94c3-32bd1f0db177-metrics-certs") pod "network-metrics-daemon-sdrp4" (UID: "858151a3-bcef-4b9a-94c3-32bd1f0db177") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:23:25.367292 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:25.367209 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clhqw\" (UniqueName: \"kubernetes.io/projected/d0d1cd03-838f-49df-b77e-5eb6e1a96deb-kube-api-access-clhqw\") pod \"network-check-target-jpkws\" (UID: \"d0d1cd03-838f-49df-b77e-5eb6e1a96deb\") " pod="openshift-network-diagnostics/network-check-target-jpkws" Apr 16 16:23:25.367468 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:25.367358 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:23:25.367468 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:25.367387 2577 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:23:25.367468 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:25.367401 2577 projected.go:194] Error preparing data for projected volume kube-api-access-clhqw for pod openshift-network-diagnostics/network-check-target-jpkws: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:23:25.367589 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:25.367485 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d0d1cd03-838f-49df-b77e-5eb6e1a96deb-kube-api-access-clhqw podName:d0d1cd03-838f-49df-b77e-5eb6e1a96deb nodeName:}" failed. No retries permitted until 2026-04-16 16:23:57.367464989 +0000 UTC m=+66.332245471 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-clhqw" (UniqueName: "kubernetes.io/projected/d0d1cd03-838f-49df-b77e-5eb6e1a96deb-kube-api-access-clhqw") pod "network-check-target-jpkws" (UID: "d0d1cd03-838f-49df-b77e-5eb6e1a96deb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:23:25.669339 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:25.669305 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8016a568-6fe9-4dfc-a543-f50b2768e5b2-metrics-tls\") pod \"dns-default-pfv5k\" (UID: \"8016a568-6fe9-4dfc-a543-f50b2768e5b2\") " pod="openshift-dns/dns-default-pfv5k" Apr 16 16:23:25.670042 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:25.669397 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/461b689e-a41b-4182-ba52-e26a1dfbc007-cert\") pod \"ingress-canary-z5t69\" 
(UID: \"461b689e-a41b-4182-ba52-e26a1dfbc007\") " pod="openshift-ingress-canary/ingress-canary-z5t69" Apr 16 16:23:25.670042 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:25.669423 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:23:25.670042 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:25.669498 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8016a568-6fe9-4dfc-a543-f50b2768e5b2-metrics-tls podName:8016a568-6fe9-4dfc-a543-f50b2768e5b2 nodeName:}" failed. No retries permitted until 2026-04-16 16:23:26.669480332 +0000 UTC m=+35.634260822 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8016a568-6fe9-4dfc-a543-f50b2768e5b2-metrics-tls") pod "dns-default-pfv5k" (UID: "8016a568-6fe9-4dfc-a543-f50b2768e5b2") : secret "dns-default-metrics-tls" not found Apr 16 16:23:25.670042 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:25.669556 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:23:25.670042 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:25.669610 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/461b689e-a41b-4182-ba52-e26a1dfbc007-cert podName:461b689e-a41b-4182-ba52-e26a1dfbc007 nodeName:}" failed. No retries permitted until 2026-04-16 16:23:26.669594387 +0000 UTC m=+35.634374864 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/461b689e-a41b-4182-ba52-e26a1dfbc007-cert") pod "ingress-canary-z5t69" (UID: "461b689e-a41b-4182-ba52-e26a1dfbc007") : secret "canary-serving-cert" not found Apr 16 16:23:26.647039 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:26.647010 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jpkws" Apr 16 16:23:26.647197 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:26.647010 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zh4s9" Apr 16 16:23:26.647197 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:26.647010 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sdrp4" Apr 16 16:23:26.650331 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:26.650311 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 16:23:26.650331 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:26.650325 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 16:23:26.650479 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:26.650312 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 16:23:26.651395 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:26.651377 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 16:23:26.651496 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:26.651423 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-pm5r6\"" Apr 16 16:23:26.651496 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:26.651439 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wtfd7\"" Apr 16 16:23:26.676903 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:26.676880 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/8016a568-6fe9-4dfc-a543-f50b2768e5b2-metrics-tls\") pod \"dns-default-pfv5k\" (UID: \"8016a568-6fe9-4dfc-a543-f50b2768e5b2\") " pod="openshift-dns/dns-default-pfv5k" Apr 16 16:23:26.677165 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:26.676964 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/461b689e-a41b-4182-ba52-e26a1dfbc007-cert\") pod \"ingress-canary-z5t69\" (UID: \"461b689e-a41b-4182-ba52-e26a1dfbc007\") " pod="openshift-ingress-canary/ingress-canary-z5t69" Apr 16 16:23:26.677165 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:26.677014 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:23:26.677165 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:26.677068 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8016a568-6fe9-4dfc-a543-f50b2768e5b2-metrics-tls podName:8016a568-6fe9-4dfc-a543-f50b2768e5b2 nodeName:}" failed. No retries permitted until 2026-04-16 16:23:28.677053179 +0000 UTC m=+37.641833653 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8016a568-6fe9-4dfc-a543-f50b2768e5b2-metrics-tls") pod "dns-default-pfv5k" (UID: "8016a568-6fe9-4dfc-a543-f50b2768e5b2") : secret "dns-default-metrics-tls" not found Apr 16 16:23:26.677165 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:26.677070 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:23:26.677165 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:26.677113 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/461b689e-a41b-4182-ba52-e26a1dfbc007-cert podName:461b689e-a41b-4182-ba52-e26a1dfbc007 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:23:28.677098981 +0000 UTC m=+37.641879463 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/461b689e-a41b-4182-ba52-e26a1dfbc007-cert") pod "ingress-canary-z5t69" (UID: "461b689e-a41b-4182-ba52-e26a1dfbc007") : secret "canary-serving-cert" not found Apr 16 16:23:27.810350 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:27.810313 2577 generic.go:358] "Generic (PLEG): container finished" podID="71da194f-358e-449e-9a55-2882465c41ef" containerID="accbdeb990e95684b6522d06c3b6e148588b465161a34ae3853a0e854eaadacb" exitCode=0 Apr 16 16:23:27.810971 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:27.810384 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b587s" event={"ID":"71da194f-358e-449e-9a55-2882465c41ef","Type":"ContainerDied","Data":"accbdeb990e95684b6522d06c3b6e148588b465161a34ae3853a0e854eaadacb"} Apr 16 16:23:28.689849 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:28.689805 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8016a568-6fe9-4dfc-a543-f50b2768e5b2-metrics-tls\") pod \"dns-default-pfv5k\" (UID: \"8016a568-6fe9-4dfc-a543-f50b2768e5b2\") " pod="openshift-dns/dns-default-pfv5k" Apr 16 16:23:28.690017 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:28.689875 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/461b689e-a41b-4182-ba52-e26a1dfbc007-cert\") pod \"ingress-canary-z5t69\" (UID: \"461b689e-a41b-4182-ba52-e26a1dfbc007\") " pod="openshift-ingress-canary/ingress-canary-z5t69" Apr 16 16:23:28.690017 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:28.689957 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:23:28.690017 ip-10-0-130-165 
kubenswrapper[2577]: E0416 16:23:28.689975 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:23:28.690116 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:28.690023 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8016a568-6fe9-4dfc-a543-f50b2768e5b2-metrics-tls podName:8016a568-6fe9-4dfc-a543-f50b2768e5b2 nodeName:}" failed. No retries permitted until 2026-04-16 16:23:32.690007148 +0000 UTC m=+41.654787621 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8016a568-6fe9-4dfc-a543-f50b2768e5b2-metrics-tls") pod "dns-default-pfv5k" (UID: "8016a568-6fe9-4dfc-a543-f50b2768e5b2") : secret "dns-default-metrics-tls" not found Apr 16 16:23:28.690116 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:28.690039 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/461b689e-a41b-4182-ba52-e26a1dfbc007-cert podName:461b689e-a41b-4182-ba52-e26a1dfbc007 nodeName:}" failed. No retries permitted until 2026-04-16 16:23:32.690032768 +0000 UTC m=+41.654813242 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/461b689e-a41b-4182-ba52-e26a1dfbc007-cert") pod "ingress-canary-z5t69" (UID: "461b689e-a41b-4182-ba52-e26a1dfbc007") : secret "canary-serving-cert" not found Apr 16 16:23:28.814617 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:28.814588 2577 generic.go:358] "Generic (PLEG): container finished" podID="71da194f-358e-449e-9a55-2882465c41ef" containerID="26296e1ad4f633dc8b49445c1a2cfa818f8c8c39c3154be8559f60b3c34ee9d5" exitCode=0 Apr 16 16:23:28.814981 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:28.814630 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b587s" event={"ID":"71da194f-358e-449e-9a55-2882465c41ef","Type":"ContainerDied","Data":"26296e1ad4f633dc8b49445c1a2cfa818f8c8c39c3154be8559f60b3c34ee9d5"} Apr 16 16:23:29.818954 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:29.818769 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b587s" event={"ID":"71da194f-358e-449e-9a55-2882465c41ef","Type":"ContainerStarted","Data":"fc718c652b7a2a3f93da141eeb14698c93c70975262791ea8944e9f350407d67"} Apr 16 16:23:29.847753 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:29.847698 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-b587s" podStartSLOduration=5.423106968 podStartE2EDuration="37.847680992s" podCreationTimestamp="2026-04-16 16:22:52 +0000 UTC" firstStartedPulling="2026-04-16 16:22:54.261398947 +0000 UTC m=+3.226179421" lastFinishedPulling="2026-04-16 16:23:26.685972972 +0000 UTC m=+35.650753445" observedRunningTime="2026-04-16 16:23:29.846608069 +0000 UTC m=+38.811388567" watchObservedRunningTime="2026-04-16 16:23:29.847680992 +0000 UTC m=+38.812461489" Apr 16 16:23:32.720358 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:32.720314 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8016a568-6fe9-4dfc-a543-f50b2768e5b2-metrics-tls\") pod \"dns-default-pfv5k\" (UID: \"8016a568-6fe9-4dfc-a543-f50b2768e5b2\") " pod="openshift-dns/dns-default-pfv5k" Apr 16 16:23:32.720789 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:32.720402 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/461b689e-a41b-4182-ba52-e26a1dfbc007-cert\") pod \"ingress-canary-z5t69\" (UID: \"461b689e-a41b-4182-ba52-e26a1dfbc007\") " pod="openshift-ingress-canary/ingress-canary-z5t69" Apr 16 16:23:32.720789 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:32.720484 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:23:32.720789 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:32.720520 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:23:32.720789 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:32.720549 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8016a568-6fe9-4dfc-a543-f50b2768e5b2-metrics-tls podName:8016a568-6fe9-4dfc-a543-f50b2768e5b2 nodeName:}" failed. No retries permitted until 2026-04-16 16:23:40.720532787 +0000 UTC m=+49.685313261 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8016a568-6fe9-4dfc-a543-f50b2768e5b2-metrics-tls") pod "dns-default-pfv5k" (UID: "8016a568-6fe9-4dfc-a543-f50b2768e5b2") : secret "dns-default-metrics-tls" not found Apr 16 16:23:32.720789 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:32.720563 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/461b689e-a41b-4182-ba52-e26a1dfbc007-cert podName:461b689e-a41b-4182-ba52-e26a1dfbc007 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:23:40.720557179 +0000 UTC m=+49.685337653 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/461b689e-a41b-4182-ba52-e26a1dfbc007-cert") pod "ingress-canary-z5t69" (UID: "461b689e-a41b-4182-ba52-e26a1dfbc007") : secret "canary-serving-cert" not found Apr 16 16:23:32.821163 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:32.821127 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/23450ee2-cf03-4966-b11a-bec44507a72d-original-pull-secret\") pod \"global-pull-secret-syncer-zh4s9\" (UID: \"23450ee2-cf03-4966-b11a-bec44507a72d\") " pod="kube-system/global-pull-secret-syncer-zh4s9" Apr 16 16:23:32.824404 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:32.824380 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/23450ee2-cf03-4966-b11a-bec44507a72d-original-pull-secret\") pod \"global-pull-secret-syncer-zh4s9\" (UID: \"23450ee2-cf03-4966-b11a-bec44507a72d\") " pod="kube-system/global-pull-secret-syncer-zh4s9" Apr 16 16:23:32.966129 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:32.966086 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-zh4s9" Apr 16 16:23:33.142185 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:33.142154 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-zh4s9"] Apr 16 16:23:33.145435 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:23:33.145407 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23450ee2_cf03_4966_b11a_bec44507a72d.slice/crio-0ac0037638f1d90ed1e898fef39fb9698073c0dc0a8eced8a5eab4d7841a0a80 WatchSource:0}: Error finding container 0ac0037638f1d90ed1e898fef39fb9698073c0dc0a8eced8a5eab4d7841a0a80: Status 404 returned error can't find the container with id 0ac0037638f1d90ed1e898fef39fb9698073c0dc0a8eced8a5eab4d7841a0a80 Apr 16 16:23:33.826460 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:33.826412 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-zh4s9" event={"ID":"23450ee2-cf03-4966-b11a-bec44507a72d","Type":"ContainerStarted","Data":"0ac0037638f1d90ed1e898fef39fb9698073c0dc0a8eced8a5eab4d7841a0a80"} Apr 16 16:23:37.835329 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:37.835293 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-zh4s9" event={"ID":"23450ee2-cf03-4966-b11a-bec44507a72d","Type":"ContainerStarted","Data":"a41774656e6c1c64f039c2f754de501c801689003374f16829a6e722818d6579"} Apr 16 16:23:40.778324 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:40.778290 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/461b689e-a41b-4182-ba52-e26a1dfbc007-cert\") pod \"ingress-canary-z5t69\" (UID: \"461b689e-a41b-4182-ba52-e26a1dfbc007\") " pod="openshift-ingress-canary/ingress-canary-z5t69" Apr 16 16:23:40.778787 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:40.778344 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8016a568-6fe9-4dfc-a543-f50b2768e5b2-metrics-tls\") pod \"dns-default-pfv5k\" (UID: \"8016a568-6fe9-4dfc-a543-f50b2768e5b2\") " pod="openshift-dns/dns-default-pfv5k" Apr 16 16:23:40.778787 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:40.778460 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:23:40.778787 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:40.778470 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:23:40.778787 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:40.778514 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8016a568-6fe9-4dfc-a543-f50b2768e5b2-metrics-tls podName:8016a568-6fe9-4dfc-a543-f50b2768e5b2 nodeName:}" failed. No retries permitted until 2026-04-16 16:23:56.778500383 +0000 UTC m=+65.743280857 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8016a568-6fe9-4dfc-a543-f50b2768e5b2-metrics-tls") pod "dns-default-pfv5k" (UID: "8016a568-6fe9-4dfc-a543-f50b2768e5b2") : secret "dns-default-metrics-tls" not found Apr 16 16:23:40.778787 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:40.778539 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/461b689e-a41b-4182-ba52-e26a1dfbc007-cert podName:461b689e-a41b-4182-ba52-e26a1dfbc007 nodeName:}" failed. No retries permitted until 2026-04-16 16:23:56.778521269 +0000 UTC m=+65.743301756 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/461b689e-a41b-4182-ba52-e26a1dfbc007-cert") pod "ingress-canary-z5t69" (UID: "461b689e-a41b-4182-ba52-e26a1dfbc007") : secret "canary-serving-cert" not found Apr 16 16:23:50.805162 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:50.805132 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hschh" Apr 16 16:23:50.837737 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:50.837691 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-zh4s9" podStartSLOduration=29.370656278 podStartE2EDuration="33.8376761s" podCreationTimestamp="2026-04-16 16:23:17 +0000 UTC" firstStartedPulling="2026-04-16 16:23:33.147059516 +0000 UTC m=+42.111839991" lastFinishedPulling="2026-04-16 16:23:37.614079339 +0000 UTC m=+46.578859813" observedRunningTime="2026-04-16 16:23:37.851842814 +0000 UTC m=+46.816623312" watchObservedRunningTime="2026-04-16 16:23:50.8376761 +0000 UTC m=+59.802456595" Apr 16 16:23:56.790699 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:56.790655 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8016a568-6fe9-4dfc-a543-f50b2768e5b2-metrics-tls\") pod \"dns-default-pfv5k\" (UID: \"8016a568-6fe9-4dfc-a543-f50b2768e5b2\") " pod="openshift-dns/dns-default-pfv5k" Apr 16 16:23:56.791154 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:56.790716 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/461b689e-a41b-4182-ba52-e26a1dfbc007-cert\") pod \"ingress-canary-z5t69\" (UID: \"461b689e-a41b-4182-ba52-e26a1dfbc007\") " pod="openshift-ingress-canary/ingress-canary-z5t69" Apr 16 16:23:56.791154 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:56.790796 2577 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:23:56.791154 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:56.790800 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:23:56.791154 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:56.790846 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/461b689e-a41b-4182-ba52-e26a1dfbc007-cert podName:461b689e-a41b-4182-ba52-e26a1dfbc007 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:28.790833188 +0000 UTC m=+97.755613662 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/461b689e-a41b-4182-ba52-e26a1dfbc007-cert") pod "ingress-canary-z5t69" (UID: "461b689e-a41b-4182-ba52-e26a1dfbc007") : secret "canary-serving-cert" not found Apr 16 16:23:56.791154 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:56.790859 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8016a568-6fe9-4dfc-a543-f50b2768e5b2-metrics-tls podName:8016a568-6fe9-4dfc-a543-f50b2768e5b2 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:28.790852932 +0000 UTC m=+97.755633406 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8016a568-6fe9-4dfc-a543-f50b2768e5b2-metrics-tls") pod "dns-default-pfv5k" (UID: "8016a568-6fe9-4dfc-a543-f50b2768e5b2") : secret "dns-default-metrics-tls" not found Apr 16 16:23:57.294704 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:57.294660 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/858151a3-bcef-4b9a-94c3-32bd1f0db177-metrics-certs\") pod \"network-metrics-daemon-sdrp4\" (UID: \"858151a3-bcef-4b9a-94c3-32bd1f0db177\") " pod="openshift-multus/network-metrics-daemon-sdrp4" Apr 16 16:23:57.297260 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:57.297240 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 16:23:57.305214 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:57.305193 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 16:23:57.305270 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:23:57.305255 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/858151a3-bcef-4b9a-94c3-32bd1f0db177-metrics-certs podName:858151a3-bcef-4b9a-94c3-32bd1f0db177 nodeName:}" failed. No retries permitted until 2026-04-16 16:25:01.30523888 +0000 UTC m=+130.270019354 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/858151a3-bcef-4b9a-94c3-32bd1f0db177-metrics-certs") pod "network-metrics-daemon-sdrp4" (UID: "858151a3-bcef-4b9a-94c3-32bd1f0db177") : secret "metrics-daemon-secret" not found Apr 16 16:23:57.395671 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:57.395632 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clhqw\" (UniqueName: \"kubernetes.io/projected/d0d1cd03-838f-49df-b77e-5eb6e1a96deb-kube-api-access-clhqw\") pod \"network-check-target-jpkws\" (UID: \"d0d1cd03-838f-49df-b77e-5eb6e1a96deb\") " pod="openshift-network-diagnostics/network-check-target-jpkws" Apr 16 16:23:57.398326 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:57.398304 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 16:23:57.408738 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:57.408715 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 16:23:57.419113 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:57.419091 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clhqw\" (UniqueName: \"kubernetes.io/projected/d0d1cd03-838f-49df-b77e-5eb6e1a96deb-kube-api-access-clhqw\") pod \"network-check-target-jpkws\" (UID: \"d0d1cd03-838f-49df-b77e-5eb6e1a96deb\") " pod="openshift-network-diagnostics/network-check-target-jpkws" Apr 16 16:23:57.560510 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:57.560422 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wtfd7\"" Apr 16 16:23:57.568575 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:57.568551 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jpkws" Apr 16 16:23:57.713896 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:57.713863 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jpkws"] Apr 16 16:23:57.723659 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:23:57.723624 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0d1cd03_838f_49df_b77e_5eb6e1a96deb.slice/crio-bb38879718055b98ae5562f9a562c104de413ec33ff61449ac99224f35999270 WatchSource:0}: Error finding container bb38879718055b98ae5562f9a562c104de413ec33ff61449ac99224f35999270: Status 404 returned error can't find the container with id bb38879718055b98ae5562f9a562c104de413ec33ff61449ac99224f35999270 Apr 16 16:23:57.872159 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:23:57.872072 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jpkws" event={"ID":"d0d1cd03-838f-49df-b77e-5eb6e1a96deb","Type":"ContainerStarted","Data":"bb38879718055b98ae5562f9a562c104de413ec33ff61449ac99224f35999270"} Apr 16 16:24:00.879987 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:00.879951 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jpkws" event={"ID":"d0d1cd03-838f-49df-b77e-5eb6e1a96deb","Type":"ContainerStarted","Data":"bf335b39ab771367f8473afbf553acb1d8a60d3e286611c049947e62f0ed2271"} Apr 16 16:24:00.880394 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:00.880206 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-jpkws" Apr 16 16:24:00.897164 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:00.897112 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-jpkws" 
podStartSLOduration=67.274166566 podStartE2EDuration="1m9.897096482s" podCreationTimestamp="2026-04-16 16:22:51 +0000 UTC" firstStartedPulling="2026-04-16 16:23:57.725676947 +0000 UTC m=+66.690457425" lastFinishedPulling="2026-04-16 16:24:00.348606857 +0000 UTC m=+69.313387341" observedRunningTime="2026-04-16 16:24:00.89607329 +0000 UTC m=+69.860853785" watchObservedRunningTime="2026-04-16 16:24:00.897096482 +0000 UTC m=+69.861876978" Apr 16 16:24:28.809417 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:28.809381 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8016a568-6fe9-4dfc-a543-f50b2768e5b2-metrics-tls\") pod \"dns-default-pfv5k\" (UID: \"8016a568-6fe9-4dfc-a543-f50b2768e5b2\") " pod="openshift-dns/dns-default-pfv5k" Apr 16 16:24:28.809851 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:28.809483 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/461b689e-a41b-4182-ba52-e26a1dfbc007-cert\") pod \"ingress-canary-z5t69\" (UID: \"461b689e-a41b-4182-ba52-e26a1dfbc007\") " pod="openshift-ingress-canary/ingress-canary-z5t69" Apr 16 16:24:28.809851 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:24:28.809561 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:24:28.809851 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:24:28.809563 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:24:28.809851 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:24:28.809621 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/461b689e-a41b-4182-ba52-e26a1dfbc007-cert podName:461b689e-a41b-4182-ba52-e26a1dfbc007 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:25:32.809607962 +0000 UTC m=+161.774388436 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/461b689e-a41b-4182-ba52-e26a1dfbc007-cert") pod "ingress-canary-z5t69" (UID: "461b689e-a41b-4182-ba52-e26a1dfbc007") : secret "canary-serving-cert" not found Apr 16 16:24:28.809851 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:24:28.809646 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8016a568-6fe9-4dfc-a543-f50b2768e5b2-metrics-tls podName:8016a568-6fe9-4dfc-a543-f50b2768e5b2 nodeName:}" failed. No retries permitted until 2026-04-16 16:25:32.809630701 +0000 UTC m=+161.774411175 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8016a568-6fe9-4dfc-a543-f50b2768e5b2-metrics-tls") pod "dns-default-pfv5k" (UID: "8016a568-6fe9-4dfc-a543-f50b2768e5b2") : secret "dns-default-metrics-tls" not found Apr 16 16:24:31.885028 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:31.884993 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-jpkws" Apr 16 16:24:36.664523 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.664491 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-txg78"] Apr 16 16:24:36.668786 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.668769 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-txg78" Apr 16 16:24:36.671463 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.671423 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 16:24:36.671684 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.671668 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 16:24:36.672151 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.672135 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:24:36.672474 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.672432 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 16:24:36.672940 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.672926 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-dwtkg\"" Apr 16 16:24:36.678464 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.678417 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-txg78"] Apr 16 16:24:36.762920 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.762882 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wqkl\" (UniqueName: \"kubernetes.io/projected/228b5774-6748-4592-bb81-0b7f69e4dcc8-kube-api-access-9wqkl\") pod \"kube-storage-version-migrator-operator-756bb7d76f-txg78\" (UID: 
\"228b5774-6748-4592-bb81-0b7f69e4dcc8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-txg78" Apr 16 16:24:36.762920 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.762926 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/228b5774-6748-4592-bb81-0b7f69e4dcc8-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-txg78\" (UID: \"228b5774-6748-4592-bb81-0b7f69e4dcc8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-txg78" Apr 16 16:24:36.763123 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.762956 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/228b5774-6748-4592-bb81-0b7f69e4dcc8-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-txg78\" (UID: \"228b5774-6748-4592-bb81-0b7f69e4dcc8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-txg78" Apr 16 16:24:36.864196 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.864152 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/228b5774-6748-4592-bb81-0b7f69e4dcc8-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-txg78\" (UID: \"228b5774-6748-4592-bb81-0b7f69e4dcc8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-txg78" Apr 16 16:24:36.864372 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.864244 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wqkl\" (UniqueName: \"kubernetes.io/projected/228b5774-6748-4592-bb81-0b7f69e4dcc8-kube-api-access-9wqkl\") pod 
\"kube-storage-version-migrator-operator-756bb7d76f-txg78\" (UID: \"228b5774-6748-4592-bb81-0b7f69e4dcc8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-txg78" Apr 16 16:24:36.864372 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.864268 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/228b5774-6748-4592-bb81-0b7f69e4dcc8-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-txg78\" (UID: \"228b5774-6748-4592-bb81-0b7f69e4dcc8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-txg78" Apr 16 16:24:36.864777 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.864761 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/228b5774-6748-4592-bb81-0b7f69e4dcc8-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-txg78\" (UID: \"228b5774-6748-4592-bb81-0b7f69e4dcc8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-txg78" Apr 16 16:24:36.866857 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.866829 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/228b5774-6748-4592-bb81-0b7f69e4dcc8-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-txg78\" (UID: \"228b5774-6748-4592-bb81-0b7f69e4dcc8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-txg78" Apr 16 16:24:36.870853 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.870829 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-v9g5q"] Apr 16 16:24:36.873772 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.873756 2577 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-v9g5q" Apr 16 16:24:36.876754 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.876738 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-r7wg7\"" Apr 16 16:24:36.877865 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.877845 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wfq97"] Apr 16 16:24:36.880519 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.880501 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wfq97" Apr 16 16:24:36.883795 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.883776 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-777984ddb8-n9rkz"] Apr 16 16:24:36.885210 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.885194 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 16:24:36.885949 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.885934 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:24:36.886400 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.886387 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-tq7gq\"" Apr 16 16:24:36.886557 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.886543 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:24:36.889131 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.889107 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wqkl\" (UniqueName: \"kubernetes.io/projected/228b5774-6748-4592-bb81-0b7f69e4dcc8-kube-api-access-9wqkl\") pod \"kube-storage-version-migrator-operator-756bb7d76f-txg78\" (UID: \"228b5774-6748-4592-bb81-0b7f69e4dcc8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-txg78" Apr 16 16:24:36.890298 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.890282 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 16:24:36.891094 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.891077 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 16:24:36.891195 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.891102 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-dwggv\"" Apr 16 16:24:36.891395 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.891381 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 16:24:36.891550 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.891536 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 16:24:36.895862 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.895839 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-v9g5q"] Apr 16 16:24:36.897150 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.897133 
2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 16:24:36.910559 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.910526 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-777984ddb8-n9rkz"] Apr 16 16:24:36.911196 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.911171 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wfq97"] Apr 16 16:24:36.964807 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.964717 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/33bea546-d2f7-4497-87b9-43156b40e189-ca-trust-extracted\") pod \"image-registry-777984ddb8-n9rkz\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:24:36.964807 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.964760 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7fwl\" (UniqueName: \"kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-kube-api-access-m7fwl\") pod \"image-registry-777984ddb8-n9rkz\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:24:36.964807 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.964790 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-registry-tls\") pod \"image-registry-777984ddb8-n9rkz\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:24:36.965011 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.964821 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/33bea546-d2f7-4497-87b9-43156b40e189-installation-pull-secrets\") pod \"image-registry-777984ddb8-n9rkz\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:24:36.965011 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.964848 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mccc\" (UniqueName: \"kubernetes.io/projected/bd99d61b-4caf-4ac1-b662-7679a76998e7-kube-api-access-9mccc\") pod \"network-check-source-7b678d77c7-v9g5q\" (UID: \"bd99d61b-4caf-4ac1-b662-7679a76998e7\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-v9g5q" Apr 16 16:24:36.965011 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.964873 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/33bea546-d2f7-4497-87b9-43156b40e189-trusted-ca\") pod \"image-registry-777984ddb8-n9rkz\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:24:36.965011 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.964907 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hfrb\" (UniqueName: \"kubernetes.io/projected/feb4994b-9816-4c4e-aee5-9a92bfc3f1cf-kube-api-access-9hfrb\") pod \"cluster-samples-operator-667775844f-wfq97\" (UID: \"feb4994b-9816-4c4e-aee5-9a92bfc3f1cf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wfq97" Apr 16 16:24:36.965011 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.964958 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/feb4994b-9816-4c4e-aee5-9a92bfc3f1cf-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-wfq97\" (UID: \"feb4994b-9816-4c4e-aee5-9a92bfc3f1cf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wfq97" Apr 16 16:24:36.965168 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.965019 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/33bea546-d2f7-4497-87b9-43156b40e189-image-registry-private-configuration\") pod \"image-registry-777984ddb8-n9rkz\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:24:36.965168 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.965040 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/33bea546-d2f7-4497-87b9-43156b40e189-registry-certificates\") pod \"image-registry-777984ddb8-n9rkz\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:24:36.965168 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.965058 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-bound-sa-token\") pod \"image-registry-777984ddb8-n9rkz\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:24:36.975183 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.975147 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-f44hl"] Apr 16 16:24:36.978038 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.978018 2577 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-txg78" Apr 16 16:24:36.978038 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.978031 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-f44hl" Apr 16 16:24:36.980381 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.980360 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:24:36.980712 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.980695 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-pkgvd\"" Apr 16 16:24:36.982871 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.982853 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 16:24:36.983163 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.982947 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 16:24:36.983834 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.983816 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 16:24:36.991720 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:36.991685 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-f44hl"] Apr 16 16:24:37.066203 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.066171 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/33bea546-d2f7-4497-87b9-43156b40e189-image-registry-private-configuration\") pod \"image-registry-777984ddb8-n9rkz\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:24:37.066203 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.066204 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/33bea546-d2f7-4497-87b9-43156b40e189-registry-certificates\") pod \"image-registry-777984ddb8-n9rkz\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:24:37.066430 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.066222 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-bound-sa-token\") pod \"image-registry-777984ddb8-n9rkz\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:24:37.066430 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.066258 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df5ba034-578f-423e-919b-afdf8297d467-config\") pod \"service-ca-operator-69965bb79d-f44hl\" (UID: \"df5ba034-578f-423e-919b-afdf8297d467\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-f44hl" Apr 16 16:24:37.066430 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.066276 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/33bea546-d2f7-4497-87b9-43156b40e189-ca-trust-extracted\") pod \"image-registry-777984ddb8-n9rkz\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " 
pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:24:37.066430 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.066291 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7fwl\" (UniqueName: \"kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-kube-api-access-m7fwl\") pod \"image-registry-777984ddb8-n9rkz\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:24:37.066430 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.066316 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-registry-tls\") pod \"image-registry-777984ddb8-n9rkz\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:24:37.066430 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.066413 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/33bea546-d2f7-4497-87b9-43156b40e189-installation-pull-secrets\") pod \"image-registry-777984ddb8-n9rkz\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:24:37.066764 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.066459 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mccc\" (UniqueName: \"kubernetes.io/projected/bd99d61b-4caf-4ac1-b662-7679a76998e7-kube-api-access-9mccc\") pod \"network-check-source-7b678d77c7-v9g5q\" (UID: \"bd99d61b-4caf-4ac1-b662-7679a76998e7\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-v9g5q" Apr 16 16:24:37.066764 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.066486 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdxgk\" (UniqueName: \"kubernetes.io/projected/df5ba034-578f-423e-919b-afdf8297d467-kube-api-access-rdxgk\") pod \"service-ca-operator-69965bb79d-f44hl\" (UID: \"df5ba034-578f-423e-919b-afdf8297d467\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-f44hl" Apr 16 16:24:37.066764 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.066517 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/33bea546-d2f7-4497-87b9-43156b40e189-trusted-ca\") pod \"image-registry-777984ddb8-n9rkz\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:24:37.066764 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.066566 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9hfrb\" (UniqueName: \"kubernetes.io/projected/feb4994b-9816-4c4e-aee5-9a92bfc3f1cf-kube-api-access-9hfrb\") pod \"cluster-samples-operator-667775844f-wfq97\" (UID: \"feb4994b-9816-4c4e-aee5-9a92bfc3f1cf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wfq97" Apr 16 16:24:37.066764 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.066596 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/feb4994b-9816-4c4e-aee5-9a92bfc3f1cf-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-wfq97\" (UID: \"feb4994b-9816-4c4e-aee5-9a92bfc3f1cf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wfq97" Apr 16 16:24:37.066764 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.066620 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/df5ba034-578f-423e-919b-afdf8297d467-serving-cert\") pod \"service-ca-operator-69965bb79d-f44hl\" (UID: \"df5ba034-578f-423e-919b-afdf8297d467\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-f44hl" Apr 16 16:24:37.067056 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.066923 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/33bea546-d2f7-4497-87b9-43156b40e189-registry-certificates\") pod \"image-registry-777984ddb8-n9rkz\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:24:37.067229 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:24:37.067190 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:24:37.067229 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:24:37.067212 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-777984ddb8-n9rkz: secret "image-registry-tls" not found Apr 16 16:24:37.067229 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:24:37.067201 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 16:24:37.067406 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:24:37.067280 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-registry-tls podName:33bea546-d2f7-4497-87b9-43156b40e189 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:37.567258046 +0000 UTC m=+106.532038529 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-registry-tls") pod "image-registry-777984ddb8-n9rkz" (UID: "33bea546-d2f7-4497-87b9-43156b40e189") : secret "image-registry-tls" not found Apr 16 16:24:37.067406 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:24:37.067298 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/feb4994b-9816-4c4e-aee5-9a92bfc3f1cf-samples-operator-tls podName:feb4994b-9816-4c4e-aee5-9a92bfc3f1cf nodeName:}" failed. No retries permitted until 2026-04-16 16:24:37.567289394 +0000 UTC m=+106.532069869 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/feb4994b-9816-4c4e-aee5-9a92bfc3f1cf-samples-operator-tls") pod "cluster-samples-operator-667775844f-wfq97" (UID: "feb4994b-9816-4c4e-aee5-9a92bfc3f1cf") : secret "samples-operator-tls" not found Apr 16 16:24:37.067562 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.067546 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/33bea546-d2f7-4497-87b9-43156b40e189-ca-trust-extracted\") pod \"image-registry-777984ddb8-n9rkz\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:24:37.068263 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.068244 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/33bea546-d2f7-4497-87b9-43156b40e189-trusted-ca\") pod \"image-registry-777984ddb8-n9rkz\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:24:37.069417 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.069396 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/33bea546-d2f7-4497-87b9-43156b40e189-image-registry-private-configuration\") pod \"image-registry-777984ddb8-n9rkz\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:24:37.069554 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.069518 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/33bea546-d2f7-4497-87b9-43156b40e189-installation-pull-secrets\") pod \"image-registry-777984ddb8-n9rkz\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:24:37.080228 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.080173 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-bound-sa-token\") pod \"image-registry-777984ddb8-n9rkz\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:24:37.083033 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.083005 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mccc\" (UniqueName: \"kubernetes.io/projected/bd99d61b-4caf-4ac1-b662-7679a76998e7-kube-api-access-9mccc\") pod \"network-check-source-7b678d77c7-v9g5q\" (UID: \"bd99d61b-4caf-4ac1-b662-7679a76998e7\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-v9g5q" Apr 16 16:24:37.083162 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.083042 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hfrb\" (UniqueName: \"kubernetes.io/projected/feb4994b-9816-4c4e-aee5-9a92bfc3f1cf-kube-api-access-9hfrb\") pod \"cluster-samples-operator-667775844f-wfq97\" (UID: \"feb4994b-9816-4c4e-aee5-9a92bfc3f1cf\") 
" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wfq97" Apr 16 16:24:37.083355 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.083299 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7fwl\" (UniqueName: \"kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-kube-api-access-m7fwl\") pod \"image-registry-777984ddb8-n9rkz\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:24:37.104071 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.104036 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-txg78"] Apr 16 16:24:37.108908 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:24:37.108668 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod228b5774_6748_4592_bb81_0b7f69e4dcc8.slice/crio-2df9dfd9af1f0e7b5ed6145875325e12d57631577136743e6f380e583c730972 WatchSource:0}: Error finding container 2df9dfd9af1f0e7b5ed6145875325e12d57631577136743e6f380e583c730972: Status 404 returned error can't find the container with id 2df9dfd9af1f0e7b5ed6145875325e12d57631577136743e6f380e583c730972 Apr 16 16:24:37.167772 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.167733 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df5ba034-578f-423e-919b-afdf8297d467-config\") pod \"service-ca-operator-69965bb79d-f44hl\" (UID: \"df5ba034-578f-423e-919b-afdf8297d467\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-f44hl" Apr 16 16:24:37.167962 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.167796 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rdxgk\" (UniqueName: 
\"kubernetes.io/projected/df5ba034-578f-423e-919b-afdf8297d467-kube-api-access-rdxgk\") pod \"service-ca-operator-69965bb79d-f44hl\" (UID: \"df5ba034-578f-423e-919b-afdf8297d467\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-f44hl" Apr 16 16:24:37.167962 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.167832 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df5ba034-578f-423e-919b-afdf8297d467-serving-cert\") pod \"service-ca-operator-69965bb79d-f44hl\" (UID: \"df5ba034-578f-423e-919b-afdf8297d467\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-f44hl" Apr 16 16:24:37.168316 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.168295 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df5ba034-578f-423e-919b-afdf8297d467-config\") pod \"service-ca-operator-69965bb79d-f44hl\" (UID: \"df5ba034-578f-423e-919b-afdf8297d467\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-f44hl" Apr 16 16:24:37.170140 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.170121 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df5ba034-578f-423e-919b-afdf8297d467-serving-cert\") pod \"service-ca-operator-69965bb79d-f44hl\" (UID: \"df5ba034-578f-423e-919b-afdf8297d467\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-f44hl" Apr 16 16:24:37.176782 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.176756 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdxgk\" (UniqueName: \"kubernetes.io/projected/df5ba034-578f-423e-919b-afdf8297d467-kube-api-access-rdxgk\") pod \"service-ca-operator-69965bb79d-f44hl\" (UID: \"df5ba034-578f-423e-919b-afdf8297d467\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-f44hl" Apr 16 
16:24:37.182605 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.182581 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-v9g5q" Apr 16 16:24:37.296952 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.296917 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-f44hl" Apr 16 16:24:37.302050 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.302019 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-v9g5q"] Apr 16 16:24:37.305290 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:24:37.305249 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd99d61b_4caf_4ac1_b662_7679a76998e7.slice/crio-15ef11c69ef7195ff6856232a0ab3ff5980ad474d125e86872aca0073d08c550 WatchSource:0}: Error finding container 15ef11c69ef7195ff6856232a0ab3ff5980ad474d125e86872aca0073d08c550: Status 404 returned error can't find the container with id 15ef11c69ef7195ff6856232a0ab3ff5980ad474d125e86872aca0073d08c550 Apr 16 16:24:37.423574 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.423493 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-f44hl"] Apr 16 16:24:37.425873 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:24:37.425842 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf5ba034_578f_423e_919b_afdf8297d467.slice/crio-4ecf3178f3404918b124cb17703ab79d78d1466c9a6e0ac1186e04469b76b313 WatchSource:0}: Error finding container 4ecf3178f3404918b124cb17703ab79d78d1466c9a6e0ac1186e04469b76b313: Status 404 returned error can't find the container with id 4ecf3178f3404918b124cb17703ab79d78d1466c9a6e0ac1186e04469b76b313 Apr 
16 16:24:37.571540 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.571424 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-registry-tls\") pod \"image-registry-777984ddb8-n9rkz\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:24:37.571708 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.571589 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/feb4994b-9816-4c4e-aee5-9a92bfc3f1cf-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-wfq97\" (UID: \"feb4994b-9816-4c4e-aee5-9a92bfc3f1cf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wfq97" Apr 16 16:24:37.571708 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:24:37.571593 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:24:37.571708 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:24:37.571675 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-777984ddb8-n9rkz: secret "image-registry-tls" not found Apr 16 16:24:37.571708 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:24:37.571656 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 16:24:37.571857 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:24:37.571729 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-registry-tls podName:33bea546-d2f7-4497-87b9-43156b40e189 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:38.571710406 +0000 UTC m=+107.536490881 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-registry-tls") pod "image-registry-777984ddb8-n9rkz" (UID: "33bea546-d2f7-4497-87b9-43156b40e189") : secret "image-registry-tls" not found Apr 16 16:24:37.571857 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:24:37.571751 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/feb4994b-9816-4c4e-aee5-9a92bfc3f1cf-samples-operator-tls podName:feb4994b-9816-4c4e-aee5-9a92bfc3f1cf nodeName:}" failed. No retries permitted until 2026-04-16 16:24:38.571737023 +0000 UTC m=+107.536517508 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/feb4994b-9816-4c4e-aee5-9a92bfc3f1cf-samples-operator-tls") pod "cluster-samples-operator-667775844f-wfq97" (UID: "feb4994b-9816-4c4e-aee5-9a92bfc3f1cf") : secret "samples-operator-tls" not found Apr 16 16:24:37.953097 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.953056 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-txg78" event={"ID":"228b5774-6748-4592-bb81-0b7f69e4dcc8","Type":"ContainerStarted","Data":"2df9dfd9af1f0e7b5ed6145875325e12d57631577136743e6f380e583c730972"} Apr 16 16:24:37.954345 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.954314 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-f44hl" event={"ID":"df5ba034-578f-423e-919b-afdf8297d467","Type":"ContainerStarted","Data":"4ecf3178f3404918b124cb17703ab79d78d1466c9a6e0ac1186e04469b76b313"} Apr 16 16:24:37.955716 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.955686 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-v9g5q" 
event={"ID":"bd99d61b-4caf-4ac1-b662-7679a76998e7","Type":"ContainerStarted","Data":"428d44fc5422cfb604a474c052ee2c2d57caf58274f2d3180c39e115934949a0"} Apr 16 16:24:37.955837 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.955720 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-v9g5q" event={"ID":"bd99d61b-4caf-4ac1-b662-7679a76998e7","Type":"ContainerStarted","Data":"15ef11c69ef7195ff6856232a0ab3ff5980ad474d125e86872aca0073d08c550"} Apr 16 16:24:37.972644 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:37.972584 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-v9g5q" podStartSLOduration=1.9725655560000002 podStartE2EDuration="1.972565556s" podCreationTimestamp="2026-04-16 16:24:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:24:37.971213942 +0000 UTC m=+106.935994438" watchObservedRunningTime="2026-04-16 16:24:37.972565556 +0000 UTC m=+106.937346052" Apr 16 16:24:38.580882 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:38.580527 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-registry-tls\") pod \"image-registry-777984ddb8-n9rkz\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:24:38.580882 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:38.580620 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/feb4994b-9816-4c4e-aee5-9a92bfc3f1cf-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-wfq97\" (UID: \"feb4994b-9816-4c4e-aee5-9a92bfc3f1cf\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wfq97" Apr 16 16:24:38.580882 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:24:38.580799 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 16:24:38.580882 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:24:38.580831 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:24:38.580882 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:24:38.580846 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-777984ddb8-n9rkz: secret "image-registry-tls" not found Apr 16 16:24:38.581249 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:24:38.580911 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/feb4994b-9816-4c4e-aee5-9a92bfc3f1cf-samples-operator-tls podName:feb4994b-9816-4c4e-aee5-9a92bfc3f1cf nodeName:}" failed. No retries permitted until 2026-04-16 16:24:40.580843673 +0000 UTC m=+109.545624171 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/feb4994b-9816-4c4e-aee5-9a92bfc3f1cf-samples-operator-tls") pod "cluster-samples-operator-667775844f-wfq97" (UID: "feb4994b-9816-4c4e-aee5-9a92bfc3f1cf") : secret "samples-operator-tls" not found Apr 16 16:24:38.581249 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:24:38.580941 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-registry-tls podName:33bea546-d2f7-4497-87b9-43156b40e189 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:40.580919368 +0000 UTC m=+109.545699846 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-registry-tls") pod "image-registry-777984ddb8-n9rkz" (UID: "33bea546-d2f7-4497-87b9-43156b40e189") : secret "image-registry-tls" not found Apr 16 16:24:39.962719 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:39.962677 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-txg78" event={"ID":"228b5774-6748-4592-bb81-0b7f69e4dcc8","Type":"ContainerStarted","Data":"791f2d762edf2afefc052a416b911d216db4376dfd488cc7f50d22e40493fa32"} Apr 16 16:24:39.963950 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:39.963923 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-f44hl" event={"ID":"df5ba034-578f-423e-919b-afdf8297d467","Type":"ContainerStarted","Data":"4023430d0847a834a9e447a23ef7067986767487a2ef135fbf96950ad656b253"} Apr 16 16:24:39.981331 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:39.981271 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-txg78" podStartSLOduration=1.520646559 podStartE2EDuration="3.981254718s" podCreationTimestamp="2026-04-16 16:24:36 +0000 UTC" firstStartedPulling="2026-04-16 16:24:37.110373655 +0000 UTC m=+106.075154128" lastFinishedPulling="2026-04-16 16:24:39.570981801 +0000 UTC m=+108.535762287" observedRunningTime="2026-04-16 16:24:39.979709189 +0000 UTC m=+108.944489686" watchObservedRunningTime="2026-04-16 16:24:39.981254718 +0000 UTC m=+108.946035216" Apr 16 16:24:39.995244 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:39.995023 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-f44hl" podStartSLOduration=1.84910619 
podStartE2EDuration="3.995006227s" podCreationTimestamp="2026-04-16 16:24:36 +0000 UTC" firstStartedPulling="2026-04-16 16:24:37.427756542 +0000 UTC m=+106.392537022" lastFinishedPulling="2026-04-16 16:24:39.573656571 +0000 UTC m=+108.538437059" observedRunningTime="2026-04-16 16:24:39.994902302 +0000 UTC m=+108.959682800" watchObservedRunningTime="2026-04-16 16:24:39.995006227 +0000 UTC m=+108.959786722" Apr 16 16:24:40.597367 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:40.597328 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-registry-tls\") pod \"image-registry-777984ddb8-n9rkz\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:24:40.597558 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:40.597400 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/feb4994b-9816-4c4e-aee5-9a92bfc3f1cf-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-wfq97\" (UID: \"feb4994b-9816-4c4e-aee5-9a92bfc3f1cf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wfq97" Apr 16 16:24:40.597558 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:24:40.597509 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:24:40.597558 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:24:40.597526 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-777984ddb8-n9rkz: secret "image-registry-tls" not found Apr 16 16:24:40.597669 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:24:40.597557 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 
16:24:40.597669 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:24:40.597587 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-registry-tls podName:33bea546-d2f7-4497-87b9-43156b40e189 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:44.597569312 +0000 UTC m=+113.562349792 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-registry-tls") pod "image-registry-777984ddb8-n9rkz" (UID: "33bea546-d2f7-4497-87b9-43156b40e189") : secret "image-registry-tls" not found Apr 16 16:24:40.597669 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:24:40.597612 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/feb4994b-9816-4c4e-aee5-9a92bfc3f1cf-samples-operator-tls podName:feb4994b-9816-4c4e-aee5-9a92bfc3f1cf nodeName:}" failed. No retries permitted until 2026-04-16 16:24:44.597598611 +0000 UTC m=+113.562379085 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/feb4994b-9816-4c4e-aee5-9a92bfc3f1cf-samples-operator-tls") pod "cluster-samples-operator-667775844f-wfq97" (UID: "feb4994b-9816-4c4e-aee5-9a92bfc3f1cf") : secret "samples-operator-tls" not found Apr 16 16:24:42.759682 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:42.759649 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-9rl88"] Apr 16 16:24:42.763787 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:42.763771 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-9rl88" Apr 16 16:24:42.766655 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:42.766625 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 16:24:42.766960 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:42.766933 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 16:24:42.767201 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:42.767183 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 16:24:42.767692 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:42.767669 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-z9m6p\"" Apr 16 16:24:42.769047 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:42.767936 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 16:24:42.770242 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:42.770221 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-9rl88"] Apr 16 16:24:42.815080 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:42.815043 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c0e7e446-d13d-4a86-93ea-bbd79e26d63b-signing-cabundle\") pod \"service-ca-bfc587fb7-9rl88\" (UID: \"c0e7e446-d13d-4a86-93ea-bbd79e26d63b\") " pod="openshift-service-ca/service-ca-bfc587fb7-9rl88" Apr 16 16:24:42.815080 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:42.815087 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4l4p\" (UniqueName: 
\"kubernetes.io/projected/c0e7e446-d13d-4a86-93ea-bbd79e26d63b-kube-api-access-n4l4p\") pod \"service-ca-bfc587fb7-9rl88\" (UID: \"c0e7e446-d13d-4a86-93ea-bbd79e26d63b\") " pod="openshift-service-ca/service-ca-bfc587fb7-9rl88" Apr 16 16:24:42.815310 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:42.815249 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c0e7e446-d13d-4a86-93ea-bbd79e26d63b-signing-key\") pod \"service-ca-bfc587fb7-9rl88\" (UID: \"c0e7e446-d13d-4a86-93ea-bbd79e26d63b\") " pod="openshift-service-ca/service-ca-bfc587fb7-9rl88" Apr 16 16:24:42.916309 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:42.916276 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c0e7e446-d13d-4a86-93ea-bbd79e26d63b-signing-key\") pod \"service-ca-bfc587fb7-9rl88\" (UID: \"c0e7e446-d13d-4a86-93ea-bbd79e26d63b\") " pod="openshift-service-ca/service-ca-bfc587fb7-9rl88" Apr 16 16:24:42.916522 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:42.916344 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c0e7e446-d13d-4a86-93ea-bbd79e26d63b-signing-cabundle\") pod \"service-ca-bfc587fb7-9rl88\" (UID: \"c0e7e446-d13d-4a86-93ea-bbd79e26d63b\") " pod="openshift-service-ca/service-ca-bfc587fb7-9rl88" Apr 16 16:24:42.916522 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:42.916369 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n4l4p\" (UniqueName: \"kubernetes.io/projected/c0e7e446-d13d-4a86-93ea-bbd79e26d63b-kube-api-access-n4l4p\") pod \"service-ca-bfc587fb7-9rl88\" (UID: \"c0e7e446-d13d-4a86-93ea-bbd79e26d63b\") " pod="openshift-service-ca/service-ca-bfc587fb7-9rl88" Apr 16 16:24:42.917590 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:42.917566 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c0e7e446-d13d-4a86-93ea-bbd79e26d63b-signing-cabundle\") pod \"service-ca-bfc587fb7-9rl88\" (UID: \"c0e7e446-d13d-4a86-93ea-bbd79e26d63b\") " pod="openshift-service-ca/service-ca-bfc587fb7-9rl88" Apr 16 16:24:42.918678 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:42.918658 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c0e7e446-d13d-4a86-93ea-bbd79e26d63b-signing-key\") pod \"service-ca-bfc587fb7-9rl88\" (UID: \"c0e7e446-d13d-4a86-93ea-bbd79e26d63b\") " pod="openshift-service-ca/service-ca-bfc587fb7-9rl88" Apr 16 16:24:42.927160 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:42.927143 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4l4p\" (UniqueName: \"kubernetes.io/projected/c0e7e446-d13d-4a86-93ea-bbd79e26d63b-kube-api-access-n4l4p\") pod \"service-ca-bfc587fb7-9rl88\" (UID: \"c0e7e446-d13d-4a86-93ea-bbd79e26d63b\") " pod="openshift-service-ca/service-ca-bfc587fb7-9rl88" Apr 16 16:24:43.077206 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:43.077103 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-9rl88" Apr 16 16:24:43.197724 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:43.197691 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-9rl88"] Apr 16 16:24:43.201018 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:24:43.200983 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0e7e446_d13d_4a86_93ea_bbd79e26d63b.slice/crio-c6ea9d6b0174397afcd6614c721d81f9acc4b90f3abd640e25e70da31993fb9d WatchSource:0}: Error finding container c6ea9d6b0174397afcd6614c721d81f9acc4b90f3abd640e25e70da31993fb9d: Status 404 returned error can't find the container with id c6ea9d6b0174397afcd6614c721d81f9acc4b90f3abd640e25e70da31993fb9d Apr 16 16:24:43.953374 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:43.953347 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7h5k5_937105e9-6cc7-458f-9b5c-007250aa5a6c/dns-node-resolver/0.log" Apr 16 16:24:43.974583 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:43.974553 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-9rl88" event={"ID":"c0e7e446-d13d-4a86-93ea-bbd79e26d63b","Type":"ContainerStarted","Data":"f47160ecf27a43ee694e5c7102cf10806b6039d87fb72ab1abae0d724865c22f"} Apr 16 16:24:43.974756 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:43.974590 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-9rl88" event={"ID":"c0e7e446-d13d-4a86-93ea-bbd79e26d63b","Type":"ContainerStarted","Data":"c6ea9d6b0174397afcd6614c721d81f9acc4b90f3abd640e25e70da31993fb9d"} Apr 16 16:24:43.993595 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:43.993540 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-9rl88" podStartSLOduration=1.993524987 
podStartE2EDuration="1.993524987s" podCreationTimestamp="2026-04-16 16:24:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:24:43.992282203 +0000 UTC m=+112.957062699" watchObservedRunningTime="2026-04-16 16:24:43.993524987 +0000 UTC m=+112.958305483" Apr 16 16:24:44.630569 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:44.630518 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/feb4994b-9816-4c4e-aee5-9a92bfc3f1cf-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-wfq97\" (UID: \"feb4994b-9816-4c4e-aee5-9a92bfc3f1cf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wfq97" Apr 16 16:24:44.630755 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:44.630644 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-registry-tls\") pod \"image-registry-777984ddb8-n9rkz\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:24:44.630755 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:24:44.630650 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 16:24:44.630755 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:24:44.630721 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/feb4994b-9816-4c4e-aee5-9a92bfc3f1cf-samples-operator-tls podName:feb4994b-9816-4c4e-aee5-9a92bfc3f1cf nodeName:}" failed. No retries permitted until 2026-04-16 16:24:52.630701118 +0000 UTC m=+121.595481597 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/feb4994b-9816-4c4e-aee5-9a92bfc3f1cf-samples-operator-tls") pod "cluster-samples-operator-667775844f-wfq97" (UID: "feb4994b-9816-4c4e-aee5-9a92bfc3f1cf") : secret "samples-operator-tls" not found Apr 16 16:24:44.630942 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:24:44.630766 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:24:44.630942 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:24:44.630779 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-777984ddb8-n9rkz: secret "image-registry-tls" not found Apr 16 16:24:44.630942 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:24:44.630843 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-registry-tls podName:33bea546-d2f7-4497-87b9-43156b40e189 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:52.630827032 +0000 UTC m=+121.595607519 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-registry-tls") pod "image-registry-777984ddb8-n9rkz" (UID: "33bea546-d2f7-4497-87b9-43156b40e189") : secret "image-registry-tls" not found Apr 16 16:24:45.353057 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:45.353031 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-gq7bg_b5a35ec4-25f4-4c5b-8175-23e377d3e9b3/node-ca/0.log" Apr 16 16:24:46.753234 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:46.753198 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-txg78_228b5774-6748-4592-bb81-0b7f69e4dcc8/kube-storage-version-migrator-operator/0.log" Apr 16 16:24:52.696586 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:52.696549 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/feb4994b-9816-4c4e-aee5-9a92bfc3f1cf-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-wfq97\" (UID: \"feb4994b-9816-4c4e-aee5-9a92bfc3f1cf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wfq97" Apr 16 16:24:52.696988 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:52.696632 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-registry-tls\") pod \"image-registry-777984ddb8-n9rkz\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:24:52.699035 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:52.699011 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/feb4994b-9816-4c4e-aee5-9a92bfc3f1cf-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-wfq97\" (UID: \"feb4994b-9816-4c4e-aee5-9a92bfc3f1cf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wfq97" Apr 16 16:24:52.699138 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:52.699051 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-registry-tls\") pod \"image-registry-777984ddb8-n9rkz\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:24:52.799815 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:52.799782 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-tq7gq\"" Apr 16 16:24:52.806526 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:52.806501 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-dwggv\"" Apr 16 16:24:52.807997 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:52.807975 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wfq97" Apr 16 16:24:52.813797 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:52.813776 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:24:52.949182 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:52.949110 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wfq97"] Apr 16 16:24:52.968518 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:52.968479 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-777984ddb8-n9rkz"] Apr 16 16:24:52.971798 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:24:52.971771 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33bea546_d2f7_4497_87b9_43156b40e189.slice/crio-47d5c90fd1bd207d5c77b850b7a4d88ab42153ded2b7d41a54b345d64d439c14 WatchSource:0}: Error finding container 47d5c90fd1bd207d5c77b850b7a4d88ab42153ded2b7d41a54b345d64d439c14: Status 404 returned error can't find the container with id 47d5c90fd1bd207d5c77b850b7a4d88ab42153ded2b7d41a54b345d64d439c14 Apr 16 16:24:53.001166 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:53.001137 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wfq97" event={"ID":"feb4994b-9816-4c4e-aee5-9a92bfc3f1cf","Type":"ContainerStarted","Data":"b2a408c099978173c2f5393a18af4a8e25a462ce5512ade3031f530c807714ae"} Apr 16 16:24:53.002424 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:53.002397 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" event={"ID":"33bea546-d2f7-4497-87b9-43156b40e189","Type":"ContainerStarted","Data":"47d5c90fd1bd207d5c77b850b7a4d88ab42153ded2b7d41a54b345d64d439c14"} Apr 16 16:24:54.007487 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:54.007422 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" 
event={"ID":"33bea546-d2f7-4497-87b9-43156b40e189","Type":"ContainerStarted","Data":"3505418c19bd28ae1aa61f2f43d0329c0aed31a203471dc349e6ad7321abaf54"} Apr 16 16:24:54.007948 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:54.007569 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:24:54.028894 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:54.028839 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" podStartSLOduration=18.028823754 podStartE2EDuration="18.028823754s" podCreationTimestamp="2026-04-16 16:24:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:24:54.028735875 +0000 UTC m=+122.993516385" watchObservedRunningTime="2026-04-16 16:24:54.028823754 +0000 UTC m=+122.993604250" Apr 16 16:24:55.010907 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:55.010868 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wfq97" event={"ID":"feb4994b-9816-4c4e-aee5-9a92bfc3f1cf","Type":"ContainerStarted","Data":"ef522d34629e5936fb1ba12dd793601a1ef6108ccef29501cfdb4a5dc7562fb5"} Apr 16 16:24:55.011399 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:55.010913 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wfq97" event={"ID":"feb4994b-9816-4c4e-aee5-9a92bfc3f1cf","Type":"ContainerStarted","Data":"006a0433656639ec53a383b8c0d6ea44158526825bb29b037acc4f024c8e3a88"} Apr 16 16:24:55.035140 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:24:55.035091 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wfq97" podStartSLOduration=17.376266355 
podStartE2EDuration="19.03507619s" podCreationTimestamp="2026-04-16 16:24:36 +0000 UTC" firstStartedPulling="2026-04-16 16:24:52.99246413 +0000 UTC m=+121.957244622" lastFinishedPulling="2026-04-16 16:24:54.651273979 +0000 UTC m=+123.616054457" observedRunningTime="2026-04-16 16:24:55.034074658 +0000 UTC m=+123.998855154" watchObservedRunningTime="2026-04-16 16:24:55.03507619 +0000 UTC m=+123.999856686" Apr 16 16:25:01.363231 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:01.363195 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/858151a3-bcef-4b9a-94c3-32bd1f0db177-metrics-certs\") pod \"network-metrics-daemon-sdrp4\" (UID: \"858151a3-bcef-4b9a-94c3-32bd1f0db177\") " pod="openshift-multus/network-metrics-daemon-sdrp4" Apr 16 16:25:01.365609 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:01.365586 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/858151a3-bcef-4b9a-94c3-32bd1f0db177-metrics-certs\") pod \"network-metrics-daemon-sdrp4\" (UID: \"858151a3-bcef-4b9a-94c3-32bd1f0db177\") " pod="openshift-multus/network-metrics-daemon-sdrp4" Apr 16 16:25:01.471658 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:01.471625 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-pm5r6\"" Apr 16 16:25:01.479938 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:01.479905 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sdrp4" Apr 16 16:25:01.597928 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:01.597775 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sdrp4"] Apr 16 16:25:01.600842 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:25:01.600813 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod858151a3_bcef_4b9a_94c3_32bd1f0db177.slice/crio-7422d7281f354be1c8e5520d60a1952543568ddbeede7c52b2d58ed93d2ca36c WatchSource:0}: Error finding container 7422d7281f354be1c8e5520d60a1952543568ddbeede7c52b2d58ed93d2ca36c: Status 404 returned error can't find the container with id 7422d7281f354be1c8e5520d60a1952543568ddbeede7c52b2d58ed93d2ca36c Apr 16 16:25:02.030242 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:02.030207 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sdrp4" event={"ID":"858151a3-bcef-4b9a-94c3-32bd1f0db177","Type":"ContainerStarted","Data":"7422d7281f354be1c8e5520d60a1952543568ddbeede7c52b2d58ed93d2ca36c"} Apr 16 16:25:02.964469 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:02.964418 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-wfnkm"] Apr 16 16:25:02.968079 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:02.968056 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wfnkm" Apr 16 16:25:02.970899 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:02.970721 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 16:25:02.972180 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:02.971967 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 16:25:02.972180 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:02.971991 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 16:25:02.972180 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:02.972027 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-t6bvr\"" Apr 16 16:25:02.972180 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:02.972156 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 16:25:02.987875 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:02.987855 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wfnkm"] Apr 16 16:25:03.029168 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.029146 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-777984ddb8-n9rkz"] Apr 16 16:25:03.029956 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.029938 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-c4dfj"] Apr 16 16:25:03.032723 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.032700 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-c4dfj" Apr 16 16:25:03.035195 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.035172 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sdrp4" event={"ID":"858151a3-bcef-4b9a-94c3-32bd1f0db177","Type":"ContainerStarted","Data":"57b9d18bd18a8b2b9259624ec5dd0668b9c2fbb24814821a2d3afa1f4b9c5923"} Apr 16 16:25:03.035497 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.035477 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-nfmqd\"" Apr 16 16:25:03.036923 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.036877 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 16 16:25:03.051627 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.051574 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-c4dfj"] Apr 16 16:25:03.077229 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.077100 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p9g7\" (UniqueName: \"kubernetes.io/projected/d8cd52ef-667c-4000-b683-c3c39c1df67e-kube-api-access-5p9g7\") pod \"insights-runtime-extractor-wfnkm\" (UID: \"d8cd52ef-667c-4000-b683-c3c39c1df67e\") " pod="openshift-insights/insights-runtime-extractor-wfnkm" Apr 16 16:25:03.077229 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.077177 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d8cd52ef-667c-4000-b683-c3c39c1df67e-data-volume\") pod \"insights-runtime-extractor-wfnkm\" (UID: \"d8cd52ef-667c-4000-b683-c3c39c1df67e\") " 
pod="openshift-insights/insights-runtime-extractor-wfnkm" Apr 16 16:25:03.077492 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.077273 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d8cd52ef-667c-4000-b683-c3c39c1df67e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wfnkm\" (UID: \"d8cd52ef-667c-4000-b683-c3c39c1df67e\") " pod="openshift-insights/insights-runtime-extractor-wfnkm" Apr 16 16:25:03.077492 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.077324 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d8cd52ef-667c-4000-b683-c3c39c1df67e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wfnkm\" (UID: \"d8cd52ef-667c-4000-b683-c3c39c1df67e\") " pod="openshift-insights/insights-runtime-extractor-wfnkm" Apr 16 16:25:03.077492 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.077393 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d8cd52ef-667c-4000-b683-c3c39c1df67e-crio-socket\") pod \"insights-runtime-extractor-wfnkm\" (UID: \"d8cd52ef-667c-4000-b683-c3c39c1df67e\") " pod="openshift-insights/insights-runtime-extractor-wfnkm" Apr 16 16:25:03.095169 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.095137 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7c6b454dd-p6wm8"] Apr 16 16:25:03.099272 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.099252 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7c6b454dd-p6wm8" Apr 16 16:25:03.116055 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.116029 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7c6b454dd-p6wm8"] Apr 16 16:25:03.178774 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.178699 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5p9g7\" (UniqueName: \"kubernetes.io/projected/d8cd52ef-667c-4000-b683-c3c39c1df67e-kube-api-access-5p9g7\") pod \"insights-runtime-extractor-wfnkm\" (UID: \"d8cd52ef-667c-4000-b683-c3c39c1df67e\") " pod="openshift-insights/insights-runtime-extractor-wfnkm" Apr 16 16:25:03.178774 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.178751 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d8cd52ef-667c-4000-b683-c3c39c1df67e-data-volume\") pod \"insights-runtime-extractor-wfnkm\" (UID: \"d8cd52ef-667c-4000-b683-c3c39c1df67e\") " pod="openshift-insights/insights-runtime-extractor-wfnkm" Apr 16 16:25:03.178963 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.178781 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d8cd52ef-667c-4000-b683-c3c39c1df67e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wfnkm\" (UID: \"d8cd52ef-667c-4000-b683-c3c39c1df67e\") " pod="openshift-insights/insights-runtime-extractor-wfnkm" Apr 16 16:25:03.178963 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.178806 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e87aa010-2a1b-4e10-a6a1-5a99c9830e6f-ca-trust-extracted\") pod \"image-registry-7c6b454dd-p6wm8\" (UID: \"e87aa010-2a1b-4e10-a6a1-5a99c9830e6f\") " 
pod="openshift-image-registry/image-registry-7c6b454dd-p6wm8" Apr 16 16:25:03.178963 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.178822 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e87aa010-2a1b-4e10-a6a1-5a99c9830e6f-trusted-ca\") pod \"image-registry-7c6b454dd-p6wm8\" (UID: \"e87aa010-2a1b-4e10-a6a1-5a99c9830e6f\") " pod="openshift-image-registry/image-registry-7c6b454dd-p6wm8" Apr 16 16:25:03.178963 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.178846 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e87aa010-2a1b-4e10-a6a1-5a99c9830e6f-registry-tls\") pod \"image-registry-7c6b454dd-p6wm8\" (UID: \"e87aa010-2a1b-4e10-a6a1-5a99c9830e6f\") " pod="openshift-image-registry/image-registry-7c6b454dd-p6wm8" Apr 16 16:25:03.179149 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.178964 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d8cd52ef-667c-4000-b683-c3c39c1df67e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wfnkm\" (UID: \"d8cd52ef-667c-4000-b683-c3c39c1df67e\") " pod="openshift-insights/insights-runtime-extractor-wfnkm" Apr 16 16:25:03.179149 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.179026 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e87aa010-2a1b-4e10-a6a1-5a99c9830e6f-registry-certificates\") pod \"image-registry-7c6b454dd-p6wm8\" (UID: \"e87aa010-2a1b-4e10-a6a1-5a99c9830e6f\") " pod="openshift-image-registry/image-registry-7c6b454dd-p6wm8" Apr 16 16:25:03.179149 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.179051 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e87aa010-2a1b-4e10-a6a1-5a99c9830e6f-image-registry-private-configuration\") pod \"image-registry-7c6b454dd-p6wm8\" (UID: \"e87aa010-2a1b-4e10-a6a1-5a99c9830e6f\") " pod="openshift-image-registry/image-registry-7c6b454dd-p6wm8" Apr 16 16:25:03.179149 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.179080 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e87aa010-2a1b-4e10-a6a1-5a99c9830e6f-installation-pull-secrets\") pod \"image-registry-7c6b454dd-p6wm8\" (UID: \"e87aa010-2a1b-4e10-a6a1-5a99c9830e6f\") " pod="openshift-image-registry/image-registry-7c6b454dd-p6wm8" Apr 16 16:25:03.179149 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.179132 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e87aa010-2a1b-4e10-a6a1-5a99c9830e6f-bound-sa-token\") pod \"image-registry-7c6b454dd-p6wm8\" (UID: \"e87aa010-2a1b-4e10-a6a1-5a99c9830e6f\") " pod="openshift-image-registry/image-registry-7c6b454dd-p6wm8" Apr 16 16:25:03.179420 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.179193 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d8cd52ef-667c-4000-b683-c3c39c1df67e-crio-socket\") pod \"insights-runtime-extractor-wfnkm\" (UID: \"d8cd52ef-667c-4000-b683-c3c39c1df67e\") " pod="openshift-insights/insights-runtime-extractor-wfnkm" Apr 16 16:25:03.179420 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.179228 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/90402a41-1582-462e-a0fc-4ffd6b779e4b-tls-certificates\") pod 
\"prometheus-operator-admission-webhook-9cb97cd87-c4dfj\" (UID: \"90402a41-1582-462e-a0fc-4ffd6b779e4b\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-c4dfj" Apr 16 16:25:03.179420 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.179271 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d8cd52ef-667c-4000-b683-c3c39c1df67e-crio-socket\") pod \"insights-runtime-extractor-wfnkm\" (UID: \"d8cd52ef-667c-4000-b683-c3c39c1df67e\") " pod="openshift-insights/insights-runtime-extractor-wfnkm" Apr 16 16:25:03.179420 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.179280 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkjnc\" (UniqueName: \"kubernetes.io/projected/e87aa010-2a1b-4e10-a6a1-5a99c9830e6f-kube-api-access-qkjnc\") pod \"image-registry-7c6b454dd-p6wm8\" (UID: \"e87aa010-2a1b-4e10-a6a1-5a99c9830e6f\") " pod="openshift-image-registry/image-registry-7c6b454dd-p6wm8" Apr 16 16:25:03.179420 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.179338 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d8cd52ef-667c-4000-b683-c3c39c1df67e-data-volume\") pod \"insights-runtime-extractor-wfnkm\" (UID: \"d8cd52ef-667c-4000-b683-c3c39c1df67e\") " pod="openshift-insights/insights-runtime-extractor-wfnkm" Apr 16 16:25:03.180182 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.180160 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d8cd52ef-667c-4000-b683-c3c39c1df67e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wfnkm\" (UID: \"d8cd52ef-667c-4000-b683-c3c39c1df67e\") " pod="openshift-insights/insights-runtime-extractor-wfnkm" Apr 16 16:25:03.181212 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.181197 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d8cd52ef-667c-4000-b683-c3c39c1df67e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wfnkm\" (UID: \"d8cd52ef-667c-4000-b683-c3c39c1df67e\") " pod="openshift-insights/insights-runtime-extractor-wfnkm" Apr 16 16:25:03.206720 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.206691 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p9g7\" (UniqueName: \"kubernetes.io/projected/d8cd52ef-667c-4000-b683-c3c39c1df67e-kube-api-access-5p9g7\") pod \"insights-runtime-extractor-wfnkm\" (UID: \"d8cd52ef-667c-4000-b683-c3c39c1df67e\") " pod="openshift-insights/insights-runtime-extractor-wfnkm" Apr 16 16:25:03.279566 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.279529 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wfnkm" Apr 16 16:25:03.279736 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.279669 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e87aa010-2a1b-4e10-a6a1-5a99c9830e6f-ca-trust-extracted\") pod \"image-registry-7c6b454dd-p6wm8\" (UID: \"e87aa010-2a1b-4e10-a6a1-5a99c9830e6f\") " pod="openshift-image-registry/image-registry-7c6b454dd-p6wm8" Apr 16 16:25:03.279736 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.279699 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e87aa010-2a1b-4e10-a6a1-5a99c9830e6f-trusted-ca\") pod \"image-registry-7c6b454dd-p6wm8\" (UID: \"e87aa010-2a1b-4e10-a6a1-5a99c9830e6f\") " pod="openshift-image-registry/image-registry-7c6b454dd-p6wm8" Apr 16 16:25:03.279736 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.279725 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e87aa010-2a1b-4e10-a6a1-5a99c9830e6f-registry-tls\") pod \"image-registry-7c6b454dd-p6wm8\" (UID: \"e87aa010-2a1b-4e10-a6a1-5a99c9830e6f\") " pod="openshift-image-registry/image-registry-7c6b454dd-p6wm8" Apr 16 16:25:03.279862 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.279764 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e87aa010-2a1b-4e10-a6a1-5a99c9830e6f-registry-certificates\") pod \"image-registry-7c6b454dd-p6wm8\" (UID: \"e87aa010-2a1b-4e10-a6a1-5a99c9830e6f\") " pod="openshift-image-registry/image-registry-7c6b454dd-p6wm8" Apr 16 16:25:03.279862 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.279800 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e87aa010-2a1b-4e10-a6a1-5a99c9830e6f-image-registry-private-configuration\") pod \"image-registry-7c6b454dd-p6wm8\" (UID: \"e87aa010-2a1b-4e10-a6a1-5a99c9830e6f\") " pod="openshift-image-registry/image-registry-7c6b454dd-p6wm8" Apr 16 16:25:03.279862 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.279828 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e87aa010-2a1b-4e10-a6a1-5a99c9830e6f-installation-pull-secrets\") pod \"image-registry-7c6b454dd-p6wm8\" (UID: \"e87aa010-2a1b-4e10-a6a1-5a99c9830e6f\") " pod="openshift-image-registry/image-registry-7c6b454dd-p6wm8" Apr 16 16:25:03.280006 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.279991 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e87aa010-2a1b-4e10-a6a1-5a99c9830e6f-bound-sa-token\") pod \"image-registry-7c6b454dd-p6wm8\" (UID: 
\"e87aa010-2a1b-4e10-a6a1-5a99c9830e6f\") " pod="openshift-image-registry/image-registry-7c6b454dd-p6wm8" Apr 16 16:25:03.280052 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.280025 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/90402a41-1582-462e-a0fc-4ffd6b779e4b-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-c4dfj\" (UID: \"90402a41-1582-462e-a0fc-4ffd6b779e4b\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-c4dfj" Apr 16 16:25:03.280093 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.280069 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkjnc\" (UniqueName: \"kubernetes.io/projected/e87aa010-2a1b-4e10-a6a1-5a99c9830e6f-kube-api-access-qkjnc\") pod \"image-registry-7c6b454dd-p6wm8\" (UID: \"e87aa010-2a1b-4e10-a6a1-5a99c9830e6f\") " pod="openshift-image-registry/image-registry-7c6b454dd-p6wm8" Apr 16 16:25:03.280138 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.280084 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e87aa010-2a1b-4e10-a6a1-5a99c9830e6f-ca-trust-extracted\") pod \"image-registry-7c6b454dd-p6wm8\" (UID: \"e87aa010-2a1b-4e10-a6a1-5a99c9830e6f\") " pod="openshift-image-registry/image-registry-7c6b454dd-p6wm8" Apr 16 16:25:03.281260 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.281209 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e87aa010-2a1b-4e10-a6a1-5a99c9830e6f-registry-certificates\") pod \"image-registry-7c6b454dd-p6wm8\" (UID: \"e87aa010-2a1b-4e10-a6a1-5a99c9830e6f\") " pod="openshift-image-registry/image-registry-7c6b454dd-p6wm8" Apr 16 16:25:03.281432 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.281400 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e87aa010-2a1b-4e10-a6a1-5a99c9830e6f-trusted-ca\") pod \"image-registry-7c6b454dd-p6wm8\" (UID: \"e87aa010-2a1b-4e10-a6a1-5a99c9830e6f\") " pod="openshift-image-registry/image-registry-7c6b454dd-p6wm8" Apr 16 16:25:03.282932 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.282907 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e87aa010-2a1b-4e10-a6a1-5a99c9830e6f-installation-pull-secrets\") pod \"image-registry-7c6b454dd-p6wm8\" (UID: \"e87aa010-2a1b-4e10-a6a1-5a99c9830e6f\") " pod="openshift-image-registry/image-registry-7c6b454dd-p6wm8" Apr 16 16:25:03.283050 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.283020 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e87aa010-2a1b-4e10-a6a1-5a99c9830e6f-registry-tls\") pod \"image-registry-7c6b454dd-p6wm8\" (UID: \"e87aa010-2a1b-4e10-a6a1-5a99c9830e6f\") " pod="openshift-image-registry/image-registry-7c6b454dd-p6wm8" Apr 16 16:25:03.283050 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.283034 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/90402a41-1582-462e-a0fc-4ffd6b779e4b-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-c4dfj\" (UID: \"90402a41-1582-462e-a0fc-4ffd6b779e4b\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-c4dfj" Apr 16 16:25:03.283162 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.283142 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e87aa010-2a1b-4e10-a6a1-5a99c9830e6f-image-registry-private-configuration\") pod \"image-registry-7c6b454dd-p6wm8\" (UID: 
\"e87aa010-2a1b-4e10-a6a1-5a99c9830e6f\") " pod="openshift-image-registry/image-registry-7c6b454dd-p6wm8" Apr 16 16:25:03.292682 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.292658 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e87aa010-2a1b-4e10-a6a1-5a99c9830e6f-bound-sa-token\") pod \"image-registry-7c6b454dd-p6wm8\" (UID: \"e87aa010-2a1b-4e10-a6a1-5a99c9830e6f\") " pod="openshift-image-registry/image-registry-7c6b454dd-p6wm8" Apr 16 16:25:03.294023 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.294002 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkjnc\" (UniqueName: \"kubernetes.io/projected/e87aa010-2a1b-4e10-a6a1-5a99c9830e6f-kube-api-access-qkjnc\") pod \"image-registry-7c6b454dd-p6wm8\" (UID: \"e87aa010-2a1b-4e10-a6a1-5a99c9830e6f\") " pod="openshift-image-registry/image-registry-7c6b454dd-p6wm8" Apr 16 16:25:03.352324 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.352294 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-c4dfj" Apr 16 16:25:03.409809 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.409780 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7c6b454dd-p6wm8" Apr 16 16:25:03.443630 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.443536 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wfnkm"] Apr 16 16:25:03.446513 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:25:03.446477 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8cd52ef_667c_4000_b683_c3c39c1df67e.slice/crio-c3bdcbf9e93386285cccde1a8fd4c5259ec13d52b1a3c4945ef8194cbfa9063d WatchSource:0}: Error finding container c3bdcbf9e93386285cccde1a8fd4c5259ec13d52b1a3c4945ef8194cbfa9063d: Status 404 returned error can't find the container with id c3bdcbf9e93386285cccde1a8fd4c5259ec13d52b1a3c4945ef8194cbfa9063d Apr 16 16:25:03.515380 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.515336 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-c4dfj"] Apr 16 16:25:03.520033 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:25:03.520004 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90402a41_1582_462e_a0fc_4ffd6b779e4b.slice/crio-f14b0ae1ebb9cc4788f81fabadb0af33b550f99385358c68f0c0981a92a830ca WatchSource:0}: Error finding container f14b0ae1ebb9cc4788f81fabadb0af33b550f99385358c68f0c0981a92a830ca: Status 404 returned error can't find the container with id f14b0ae1ebb9cc4788f81fabadb0af33b550f99385358c68f0c0981a92a830ca Apr 16 16:25:03.569682 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:03.569648 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7c6b454dd-p6wm8"] Apr 16 16:25:03.572148 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:25:03.572113 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode87aa010_2a1b_4e10_a6a1_5a99c9830e6f.slice/crio-23801ea27cd84b43103de90088bbcbfe2649c415a721321df04a82d79077df58 WatchSource:0}: Error finding container 23801ea27cd84b43103de90088bbcbfe2649c415a721321df04a82d79077df58: Status 404 returned error can't find the container with id 23801ea27cd84b43103de90088bbcbfe2649c415a721321df04a82d79077df58 Apr 16 16:25:04.038759 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:04.038725 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7c6b454dd-p6wm8" event={"ID":"e87aa010-2a1b-4e10-a6a1-5a99c9830e6f","Type":"ContainerStarted","Data":"278838032b98ae732edafd2d95f3412582c250510dfcfec4838882ee0b34a213"} Apr 16 16:25:04.038759 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:04.038762 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7c6b454dd-p6wm8" event={"ID":"e87aa010-2a1b-4e10-a6a1-5a99c9830e6f","Type":"ContainerStarted","Data":"23801ea27cd84b43103de90088bbcbfe2649c415a721321df04a82d79077df58"} Apr 16 16:25:04.039236 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:04.038956 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7c6b454dd-p6wm8" Apr 16 16:25:04.039935 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:04.039915 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wfnkm" event={"ID":"d8cd52ef-667c-4000-b683-c3c39c1df67e","Type":"ContainerStarted","Data":"a4706eef66b41af729b0f21ec68edc9c0e14baa8b3aa92754833f6ffe63fab35"} Apr 16 16:25:04.040010 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:04.039941 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wfnkm" 
event={"ID":"d8cd52ef-667c-4000-b683-c3c39c1df67e","Type":"ContainerStarted","Data":"c3bdcbf9e93386285cccde1a8fd4c5259ec13d52b1a3c4945ef8194cbfa9063d"} Apr 16 16:25:04.041338 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:04.041277 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sdrp4" event={"ID":"858151a3-bcef-4b9a-94c3-32bd1f0db177","Type":"ContainerStarted","Data":"3fa1c6c8002c781530540143b426819123d70384a5b1b129dcd8503c1f027742"} Apr 16 16:25:04.042311 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:04.042291 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-c4dfj" event={"ID":"90402a41-1582-462e-a0fc-4ffd6b779e4b","Type":"ContainerStarted","Data":"f14b0ae1ebb9cc4788f81fabadb0af33b550f99385358c68f0c0981a92a830ca"} Apr 16 16:25:04.063003 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:04.062942 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7c6b454dd-p6wm8" podStartSLOduration=1.062926111 podStartE2EDuration="1.062926111s" podCreationTimestamp="2026-04-16 16:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:25:04.06141164 +0000 UTC m=+133.026192135" watchObservedRunningTime="2026-04-16 16:25:04.062926111 +0000 UTC m=+133.027706608" Apr 16 16:25:04.082646 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:04.082603 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-sdrp4" podStartSLOduration=131.774031916 podStartE2EDuration="2m13.082589064s" podCreationTimestamp="2026-04-16 16:22:51 +0000 UTC" firstStartedPulling="2026-04-16 16:25:01.602572363 +0000 UTC m=+130.567352839" lastFinishedPulling="2026-04-16 16:25:02.911129509 +0000 UTC m=+131.875909987" observedRunningTime="2026-04-16 
16:25:04.082053638 +0000 UTC m=+133.046834136" watchObservedRunningTime="2026-04-16 16:25:04.082589064 +0000 UTC m=+133.047369571" Apr 16 16:25:05.048951 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:05.048910 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wfnkm" event={"ID":"d8cd52ef-667c-4000-b683-c3c39c1df67e","Type":"ContainerStarted","Data":"83936f2c5525387cb916638ae2b3a3d303429674215e238542506450b60dfd66"} Apr 16 16:25:06.053072 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:06.053041 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wfnkm" event={"ID":"d8cd52ef-667c-4000-b683-c3c39c1df67e","Type":"ContainerStarted","Data":"030ad79df5d1cf80c1e04f34f31de502a175ed6326cd12f55dc1e2cf54e140b9"} Apr 16 16:25:06.079090 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:06.079037 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-wfnkm" podStartSLOduration=1.676319102 podStartE2EDuration="4.079022504s" podCreationTimestamp="2026-04-16 16:25:02 +0000 UTC" firstStartedPulling="2026-04-16 16:25:03.523952293 +0000 UTC m=+132.488732767" lastFinishedPulling="2026-04-16 16:25:05.926655692 +0000 UTC m=+134.891436169" observedRunningTime="2026-04-16 16:25:06.07734808 +0000 UTC m=+135.042128569" watchObservedRunningTime="2026-04-16 16:25:06.079022504 +0000 UTC m=+135.043802997" Apr 16 16:25:07.056670 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:07.056631 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-c4dfj" event={"ID":"90402a41-1582-462e-a0fc-4ffd6b779e4b","Type":"ContainerStarted","Data":"692e870dc5f29836dd574ce16c5f669687c0fdd1e91ab7bfef95fef408dee9fe"} Apr 16 16:25:07.057135 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:07.057118 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-c4dfj" Apr 16 16:25:07.061424 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:07.061397 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-c4dfj" Apr 16 16:25:07.073298 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:07.073236 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-c4dfj" podStartSLOduration=1.963292884 podStartE2EDuration="5.073218105s" podCreationTimestamp="2026-04-16 16:25:02 +0000 UTC" firstStartedPulling="2026-04-16 16:25:03.522051777 +0000 UTC m=+132.486832250" lastFinishedPulling="2026-04-16 16:25:06.631976982 +0000 UTC m=+135.596757471" observedRunningTime="2026-04-16 16:25:07.071927785 +0000 UTC m=+136.036708281" watchObservedRunningTime="2026-04-16 16:25:07.073218105 +0000 UTC m=+136.037998605" Apr 16 16:25:08.207242 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:08.207202 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-849wn"] Apr 16 16:25:08.210749 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:08.210731 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-849wn" Apr 16 16:25:08.215161 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:08.215133 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 16 16:25:08.216316 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:08.216292 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 16 16:25:08.216316 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:08.216310 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 16:25:08.216523 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:08.216324 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 16:25:08.216653 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:08.216638 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 16:25:08.216736 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:08.216638 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-gk8bd\"" Apr 16 16:25:08.221843 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:08.221822 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-849wn"] Apr 16 16:25:08.325557 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:08.325526 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b09cd66f-8167-4706-bf22-d8813a45efde-metrics-client-ca\") pod \"prometheus-operator-78f957474d-849wn\" (UID: \"b09cd66f-8167-4706-bf22-d8813a45efde\") " 
pod="openshift-monitoring/prometheus-operator-78f957474d-849wn" Apr 16 16:25:08.325733 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:08.325583 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b09cd66f-8167-4706-bf22-d8813a45efde-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-849wn\" (UID: \"b09cd66f-8167-4706-bf22-d8813a45efde\") " pod="openshift-monitoring/prometheus-operator-78f957474d-849wn" Apr 16 16:25:08.325733 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:08.325648 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b28v\" (UniqueName: \"kubernetes.io/projected/b09cd66f-8167-4706-bf22-d8813a45efde-kube-api-access-2b28v\") pod \"prometheus-operator-78f957474d-849wn\" (UID: \"b09cd66f-8167-4706-bf22-d8813a45efde\") " pod="openshift-monitoring/prometheus-operator-78f957474d-849wn" Apr 16 16:25:08.325733 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:08.325681 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b09cd66f-8167-4706-bf22-d8813a45efde-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-849wn\" (UID: \"b09cd66f-8167-4706-bf22-d8813a45efde\") " pod="openshift-monitoring/prometheus-operator-78f957474d-849wn" Apr 16 16:25:08.426459 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:08.426423 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b09cd66f-8167-4706-bf22-d8813a45efde-metrics-client-ca\") pod \"prometheus-operator-78f957474d-849wn\" (UID: \"b09cd66f-8167-4706-bf22-d8813a45efde\") " pod="openshift-monitoring/prometheus-operator-78f957474d-849wn" Apr 16 16:25:08.426562 ip-10-0-130-165 
kubenswrapper[2577]: I0416 16:25:08.426493 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b09cd66f-8167-4706-bf22-d8813a45efde-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-849wn\" (UID: \"b09cd66f-8167-4706-bf22-d8813a45efde\") " pod="openshift-monitoring/prometheus-operator-78f957474d-849wn" Apr 16 16:25:08.426562 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:08.426531 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2b28v\" (UniqueName: \"kubernetes.io/projected/b09cd66f-8167-4706-bf22-d8813a45efde-kube-api-access-2b28v\") pod \"prometheus-operator-78f957474d-849wn\" (UID: \"b09cd66f-8167-4706-bf22-d8813a45efde\") " pod="openshift-monitoring/prometheus-operator-78f957474d-849wn" Apr 16 16:25:08.426650 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:08.426563 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b09cd66f-8167-4706-bf22-d8813a45efde-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-849wn\" (UID: \"b09cd66f-8167-4706-bf22-d8813a45efde\") " pod="openshift-monitoring/prometheus-operator-78f957474d-849wn" Apr 16 16:25:08.427194 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:08.427159 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b09cd66f-8167-4706-bf22-d8813a45efde-metrics-client-ca\") pod \"prometheus-operator-78f957474d-849wn\" (UID: \"b09cd66f-8167-4706-bf22-d8813a45efde\") " pod="openshift-monitoring/prometheus-operator-78f957474d-849wn" Apr 16 16:25:08.428903 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:08.428880 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/b09cd66f-8167-4706-bf22-d8813a45efde-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-849wn\" (UID: \"b09cd66f-8167-4706-bf22-d8813a45efde\") " pod="openshift-monitoring/prometheus-operator-78f957474d-849wn" Apr 16 16:25:08.429006 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:08.428960 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b09cd66f-8167-4706-bf22-d8813a45efde-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-849wn\" (UID: \"b09cd66f-8167-4706-bf22-d8813a45efde\") " pod="openshift-monitoring/prometheus-operator-78f957474d-849wn" Apr 16 16:25:08.435837 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:08.435807 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b28v\" (UniqueName: \"kubernetes.io/projected/b09cd66f-8167-4706-bf22-d8813a45efde-kube-api-access-2b28v\") pod \"prometheus-operator-78f957474d-849wn\" (UID: \"b09cd66f-8167-4706-bf22-d8813a45efde\") " pod="openshift-monitoring/prometheus-operator-78f957474d-849wn" Apr 16 16:25:08.519737 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:08.519630 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-849wn" Apr 16 16:25:08.642550 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:08.642513 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-849wn"] Apr 16 16:25:08.646484 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:25:08.646436 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb09cd66f_8167_4706_bf22_d8813a45efde.slice/crio-d82a0eb93c717ad896cf7f3a5ddbc1177c71794ebef03bc2b9c1ab4f2b452f1f WatchSource:0}: Error finding container d82a0eb93c717ad896cf7f3a5ddbc1177c71794ebef03bc2b9c1ab4f2b452f1f: Status 404 returned error can't find the container with id d82a0eb93c717ad896cf7f3a5ddbc1177c71794ebef03bc2b9c1ab4f2b452f1f Apr 16 16:25:09.062529 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:09.062492 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-849wn" event={"ID":"b09cd66f-8167-4706-bf22-d8813a45efde","Type":"ContainerStarted","Data":"d82a0eb93c717ad896cf7f3a5ddbc1177c71794ebef03bc2b9c1ab4f2b452f1f"} Apr 16 16:25:10.066813 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:10.066782 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-849wn" event={"ID":"b09cd66f-8167-4706-bf22-d8813a45efde","Type":"ContainerStarted","Data":"268caa5acd43f0ddbde76284b2537da950c93207265d747009fbb42153aac9a5"} Apr 16 16:25:10.066813 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:10.066817 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-849wn" event={"ID":"b09cd66f-8167-4706-bf22-d8813a45efde","Type":"ContainerStarted","Data":"b5e9157a6213b7872886759b905f0e5900f92dc07fd7124821e370dc3ac331d2"} Apr 16 16:25:10.140197 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:10.140076 2577 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-849wn" podStartSLOduration=0.932333344 podStartE2EDuration="2.140061925s" podCreationTimestamp="2026-04-16 16:25:08 +0000 UTC" firstStartedPulling="2026-04-16 16:25:08.648266344 +0000 UTC m=+137.613046818" lastFinishedPulling="2026-04-16 16:25:09.855994921 +0000 UTC m=+138.820775399" observedRunningTime="2026-04-16 16:25:10.139797983 +0000 UTC m=+139.104578504" watchObservedRunningTime="2026-04-16 16:25:10.140061925 +0000 UTC m=+139.104842467" Apr 16 16:25:11.619057 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.619014 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-4ch4v"] Apr 16 16:25:11.622857 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.622823 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-4ch4v" Apr 16 16:25:11.625210 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.625173 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-5x8g6"] Apr 16 16:25:11.626517 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.626496 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 16:25:11.628724 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.628705 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-5x8g6" Apr 16 16:25:11.629263 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.629238 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 16:25:11.630871 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.630851 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-qlxsl\"" Apr 16 16:25:11.631019 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.631003 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-mp6v2\"" Apr 16 16:25:11.640775 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.638503 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 16:25:11.640775 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.640481 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-sx2kr"] Apr 16 16:25:11.640775 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.640688 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 16:25:11.643174 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.643154 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 16:25:11.644321 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.644306 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-sx2kr" Apr 16 16:25:11.646705 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.646677 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 16:25:11.647202 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.647184 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-98pfz\"" Apr 16 16:25:11.647523 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.647505 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 16:25:11.647591 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.647554 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 16:25:11.698149 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.698112 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-sx2kr"] Apr 16 16:25:11.738151 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.738117 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-4ch4v"] Apr 16 16:25:11.754890 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.754839 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d8bf96d8-da41-4d71-80d1-f04a83e90145-root\") pod \"node-exporter-5x8g6\" (UID: \"d8bf96d8-da41-4d71-80d1-f04a83e90145\") " pod="openshift-monitoring/node-exporter-5x8g6" Apr 16 16:25:11.755087 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.754898 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-bgsbq\" (UniqueName: \"kubernetes.io/projected/c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb-kube-api-access-bgsbq\") pod \"kube-state-metrics-7479c89684-sx2kr\" (UID: \"c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-sx2kr" Apr 16 16:25:11.755087 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.754930 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d8bf96d8-da41-4d71-80d1-f04a83e90145-metrics-client-ca\") pod \"node-exporter-5x8g6\" (UID: \"d8bf96d8-da41-4d71-80d1-f04a83e90145\") " pod="openshift-monitoring/node-exporter-5x8g6" Apr 16 16:25:11.755087 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.754958 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-sx2kr\" (UID: \"c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-sx2kr" Apr 16 16:25:11.755087 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.754986 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/71243495-3c08-450a-b8d9-dce03ef8be95-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-4ch4v\" (UID: \"71243495-3c08-450a-b8d9-dce03ef8be95\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4ch4v" Apr 16 16:25:11.755087 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.755018 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/71243495-3c08-450a-b8d9-dce03ef8be95-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-4ch4v\" (UID: \"71243495-3c08-450a-b8d9-dce03ef8be95\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4ch4v" Apr 16 16:25:11.755087 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.755046 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-sx2kr\" (UID: \"c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-sx2kr" Apr 16 16:25:11.755505 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.755090 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/71243495-3c08-450a-b8d9-dce03ef8be95-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-4ch4v\" (UID: \"71243495-3c08-450a-b8d9-dce03ef8be95\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4ch4v" Apr 16 16:25:11.755505 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.755116 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d8bf96d8-da41-4d71-80d1-f04a83e90145-node-exporter-textfile\") pod \"node-exporter-5x8g6\" (UID: \"d8bf96d8-da41-4d71-80d1-f04a83e90145\") " pod="openshift-monitoring/node-exporter-5x8g6" Apr 16 16:25:11.755505 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.755141 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d8bf96d8-da41-4d71-80d1-f04a83e90145-node-exporter-wtmp\") pod \"node-exporter-5x8g6\" (UID: 
\"d8bf96d8-da41-4d71-80d1-f04a83e90145\") " pod="openshift-monitoring/node-exporter-5x8g6" Apr 16 16:25:11.755505 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.755172 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d8bf96d8-da41-4d71-80d1-f04a83e90145-node-exporter-accelerators-collector-config\") pod \"node-exporter-5x8g6\" (UID: \"d8bf96d8-da41-4d71-80d1-f04a83e90145\") " pod="openshift-monitoring/node-exporter-5x8g6" Apr 16 16:25:11.755505 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.755200 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzmpr\" (UniqueName: \"kubernetes.io/projected/d8bf96d8-da41-4d71-80d1-f04a83e90145-kube-api-access-bzmpr\") pod \"node-exporter-5x8g6\" (UID: \"d8bf96d8-da41-4d71-80d1-f04a83e90145\") " pod="openshift-monitoring/node-exporter-5x8g6" Apr 16 16:25:11.755505 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.755226 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5lkf\" (UniqueName: \"kubernetes.io/projected/71243495-3c08-450a-b8d9-dce03ef8be95-kube-api-access-m5lkf\") pod \"openshift-state-metrics-5669946b84-4ch4v\" (UID: \"71243495-3c08-450a-b8d9-dce03ef8be95\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4ch4v" Apr 16 16:25:11.755505 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.755249 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-sx2kr\" (UID: \"c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-sx2kr" Apr 16 16:25:11.755505 ip-10-0-130-165 kubenswrapper[2577]: 
I0416 16:25:11.755277 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-sx2kr\" (UID: \"c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-sx2kr" Apr 16 16:25:11.755505 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.755315 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d8bf96d8-da41-4d71-80d1-f04a83e90145-sys\") pod \"node-exporter-5x8g6\" (UID: \"d8bf96d8-da41-4d71-80d1-f04a83e90145\") " pod="openshift-monitoring/node-exporter-5x8g6" Apr 16 16:25:11.755505 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.755339 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d8bf96d8-da41-4d71-80d1-f04a83e90145-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5x8g6\" (UID: \"d8bf96d8-da41-4d71-80d1-f04a83e90145\") " pod="openshift-monitoring/node-exporter-5x8g6" Apr 16 16:25:11.755505 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.755375 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d8bf96d8-da41-4d71-80d1-f04a83e90145-node-exporter-tls\") pod \"node-exporter-5x8g6\" (UID: \"d8bf96d8-da41-4d71-80d1-f04a83e90145\") " pod="openshift-monitoring/node-exporter-5x8g6" Apr 16 16:25:11.755505 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.755430 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: 
\"kubernetes.io/empty-dir/c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-sx2kr\" (UID: \"c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-sx2kr" Apr 16 16:25:11.855878 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.855840 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-sx2kr\" (UID: \"c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-sx2kr" Apr 16 16:25:11.856076 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.855911 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d8bf96d8-da41-4d71-80d1-f04a83e90145-root\") pod \"node-exporter-5x8g6\" (UID: \"d8bf96d8-da41-4d71-80d1-f04a83e90145\") " pod="openshift-monitoring/node-exporter-5x8g6" Apr 16 16:25:11.856076 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.855951 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bgsbq\" (UniqueName: \"kubernetes.io/projected/c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb-kube-api-access-bgsbq\") pod \"kube-state-metrics-7479c89684-sx2kr\" (UID: \"c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-sx2kr" Apr 16 16:25:11.856076 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.856003 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d8bf96d8-da41-4d71-80d1-f04a83e90145-metrics-client-ca\") pod \"node-exporter-5x8g6\" (UID: \"d8bf96d8-da41-4d71-80d1-f04a83e90145\") " pod="openshift-monitoring/node-exporter-5x8g6" Apr 16 16:25:11.856076 ip-10-0-130-165 kubenswrapper[2577]: I0416 
16:25:11.856031 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-sx2kr\" (UID: \"c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-sx2kr" Apr 16 16:25:11.856076 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.856034 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d8bf96d8-da41-4d71-80d1-f04a83e90145-root\") pod \"node-exporter-5x8g6\" (UID: \"d8bf96d8-da41-4d71-80d1-f04a83e90145\") " pod="openshift-monitoring/node-exporter-5x8g6" Apr 16 16:25:11.856076 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.856059 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/71243495-3c08-450a-b8d9-dce03ef8be95-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-4ch4v\" (UID: \"71243495-3c08-450a-b8d9-dce03ef8be95\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4ch4v" Apr 16 16:25:11.856379 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.856090 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/71243495-3c08-450a-b8d9-dce03ef8be95-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-4ch4v\" (UID: \"71243495-3c08-450a-b8d9-dce03ef8be95\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4ch4v" Apr 16 16:25:11.856379 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.856122 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-sx2kr\" (UID: \"c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-sx2kr" Apr 16 16:25:11.856379 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.856171 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/71243495-3c08-450a-b8d9-dce03ef8be95-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-4ch4v\" (UID: \"71243495-3c08-450a-b8d9-dce03ef8be95\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4ch4v" Apr 16 16:25:11.856379 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.856210 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d8bf96d8-da41-4d71-80d1-f04a83e90145-node-exporter-textfile\") pod \"node-exporter-5x8g6\" (UID: \"d8bf96d8-da41-4d71-80d1-f04a83e90145\") " pod="openshift-monitoring/node-exporter-5x8g6" Apr 16 16:25:11.856379 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.856235 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d8bf96d8-da41-4d71-80d1-f04a83e90145-node-exporter-wtmp\") pod \"node-exporter-5x8g6\" (UID: \"d8bf96d8-da41-4d71-80d1-f04a83e90145\") " pod="openshift-monitoring/node-exporter-5x8g6" Apr 16 16:25:11.856379 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.856278 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d8bf96d8-da41-4d71-80d1-f04a83e90145-node-exporter-accelerators-collector-config\") pod \"node-exporter-5x8g6\" (UID: \"d8bf96d8-da41-4d71-80d1-f04a83e90145\") " pod="openshift-monitoring/node-exporter-5x8g6" Apr 
16 16:25:11.856379 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.856303 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzmpr\" (UniqueName: \"kubernetes.io/projected/d8bf96d8-da41-4d71-80d1-f04a83e90145-kube-api-access-bzmpr\") pod \"node-exporter-5x8g6\" (UID: \"d8bf96d8-da41-4d71-80d1-f04a83e90145\") " pod="openshift-monitoring/node-exporter-5x8g6" Apr 16 16:25:11.856379 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.856302 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-sx2kr\" (UID: \"c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-sx2kr" Apr 16 16:25:11.856379 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.856331 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5lkf\" (UniqueName: \"kubernetes.io/projected/71243495-3c08-450a-b8d9-dce03ef8be95-kube-api-access-m5lkf\") pod \"openshift-state-metrics-5669946b84-4ch4v\" (UID: \"71243495-3c08-450a-b8d9-dce03ef8be95\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4ch4v" Apr 16 16:25:11.856379 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.856358 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-sx2kr\" (UID: \"c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-sx2kr" Apr 16 16:25:11.856870 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.856386 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: 
\"kubernetes.io/configmap/c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-sx2kr\" (UID: \"c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-sx2kr" Apr 16 16:25:11.856870 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:25:11.856410 2577 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 16 16:25:11.856870 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.856430 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d8bf96d8-da41-4d71-80d1-f04a83e90145-sys\") pod \"node-exporter-5x8g6\" (UID: \"d8bf96d8-da41-4d71-80d1-f04a83e90145\") " pod="openshift-monitoring/node-exporter-5x8g6" Apr 16 16:25:11.856870 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.856473 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d8bf96d8-da41-4d71-80d1-f04a83e90145-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5x8g6\" (UID: \"d8bf96d8-da41-4d71-80d1-f04a83e90145\") " pod="openshift-monitoring/node-exporter-5x8g6" Apr 16 16:25:11.856870 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:25:11.856515 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71243495-3c08-450a-b8d9-dce03ef8be95-openshift-state-metrics-tls podName:71243495-3c08-450a-b8d9-dce03ef8be95 nodeName:}" failed. No retries permitted until 2026-04-16 16:25:12.356493058 +0000 UTC m=+141.321273535 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/71243495-3c08-450a-b8d9-dce03ef8be95-openshift-state-metrics-tls") pod "openshift-state-metrics-5669946b84-4ch4v" (UID: "71243495-3c08-450a-b8d9-dce03ef8be95") : secret "openshift-state-metrics-tls" not found Apr 16 16:25:11.856870 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.856569 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d8bf96d8-da41-4d71-80d1-f04a83e90145-node-exporter-tls\") pod \"node-exporter-5x8g6\" (UID: \"d8bf96d8-da41-4d71-80d1-f04a83e90145\") " pod="openshift-monitoring/node-exporter-5x8g6" Apr 16 16:25:11.856870 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:25:11.856687 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 16:25:11.856870 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:25:11.856722 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8bf96d8-da41-4d71-80d1-f04a83e90145-node-exporter-tls podName:d8bf96d8-da41-4d71-80d1-f04a83e90145 nodeName:}" failed. No retries permitted until 2026-04-16 16:25:12.356710273 +0000 UTC m=+141.321490750 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/d8bf96d8-da41-4d71-80d1-f04a83e90145-node-exporter-tls") pod "node-exporter-5x8g6" (UID: "d8bf96d8-da41-4d71-80d1-f04a83e90145") : secret "node-exporter-tls" not found Apr 16 16:25:11.858166 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.857260 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d8bf96d8-da41-4d71-80d1-f04a83e90145-metrics-client-ca\") pod \"node-exporter-5x8g6\" (UID: \"d8bf96d8-da41-4d71-80d1-f04a83e90145\") " pod="openshift-monitoring/node-exporter-5x8g6" Apr 16 16:25:11.858166 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.857726 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d8bf96d8-da41-4d71-80d1-f04a83e90145-node-exporter-wtmp\") pod \"node-exporter-5x8g6\" (UID: \"d8bf96d8-da41-4d71-80d1-f04a83e90145\") " pod="openshift-monitoring/node-exporter-5x8g6" Apr 16 16:25:11.858166 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.857780 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d8bf96d8-da41-4d71-80d1-f04a83e90145-sys\") pod \"node-exporter-5x8g6\" (UID: \"d8bf96d8-da41-4d71-80d1-f04a83e90145\") " pod="openshift-monitoring/node-exporter-5x8g6" Apr 16 16:25:11.858360 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.858264 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d8bf96d8-da41-4d71-80d1-f04a83e90145-node-exporter-accelerators-collector-config\") pod \"node-exporter-5x8g6\" (UID: \"d8bf96d8-da41-4d71-80d1-f04a83e90145\") " pod="openshift-monitoring/node-exporter-5x8g6" Apr 16 16:25:11.859121 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.859008 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/71243495-3c08-450a-b8d9-dce03ef8be95-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-4ch4v\" (UID: \"71243495-3c08-450a-b8d9-dce03ef8be95\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4ch4v" Apr 16 16:25:11.859218 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.859178 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-sx2kr\" (UID: \"c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-sx2kr" Apr 16 16:25:11.859518 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.859433 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-sx2kr\" (UID: \"c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-sx2kr" Apr 16 16:25:11.859629 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.859600 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-sx2kr\" (UID: \"c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-sx2kr" Apr 16 16:25:11.859777 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.859759 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/d8bf96d8-da41-4d71-80d1-f04a83e90145-node-exporter-textfile\") pod \"node-exporter-5x8g6\" (UID: \"d8bf96d8-da41-4d71-80d1-f04a83e90145\") " pod="openshift-monitoring/node-exporter-5x8g6" Apr 16 16:25:11.860538 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.860496 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d8bf96d8-da41-4d71-80d1-f04a83e90145-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5x8g6\" (UID: \"d8bf96d8-da41-4d71-80d1-f04a83e90145\") " pod="openshift-monitoring/node-exporter-5x8g6" Apr 16 16:25:11.862044 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.862023 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/71243495-3c08-450a-b8d9-dce03ef8be95-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-4ch4v\" (UID: \"71243495-3c08-450a-b8d9-dce03ef8be95\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4ch4v" Apr 16 16:25:11.862154 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.862112 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-sx2kr\" (UID: \"c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-sx2kr" Apr 16 16:25:11.866119 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.866031 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgsbq\" (UniqueName: \"kubernetes.io/projected/c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb-kube-api-access-bgsbq\") pod \"kube-state-metrics-7479c89684-sx2kr\" (UID: \"c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb\") " 
pod="openshift-monitoring/kube-state-metrics-7479c89684-sx2kr" Apr 16 16:25:11.866405 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.866385 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzmpr\" (UniqueName: \"kubernetes.io/projected/d8bf96d8-da41-4d71-80d1-f04a83e90145-kube-api-access-bzmpr\") pod \"node-exporter-5x8g6\" (UID: \"d8bf96d8-da41-4d71-80d1-f04a83e90145\") " pod="openshift-monitoring/node-exporter-5x8g6" Apr 16 16:25:11.866603 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.866580 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5lkf\" (UniqueName: \"kubernetes.io/projected/71243495-3c08-450a-b8d9-dce03ef8be95-kube-api-access-m5lkf\") pod \"openshift-state-metrics-5669946b84-4ch4v\" (UID: \"71243495-3c08-450a-b8d9-dce03ef8be95\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4ch4v" Apr 16 16:25:11.957949 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:11.957914 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-sx2kr" Apr 16 16:25:12.111548 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:12.111515 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-sx2kr"] Apr 16 16:25:12.114580 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:25:12.114475 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc63e3e8b_c729_4ce7_af91_e9f6ee85dbdb.slice/crio-136b55d45fd09c7003c0da52f1c377ad7568b4bbc9b61496b66863da7211bb94 WatchSource:0}: Error finding container 136b55d45fd09c7003c0da52f1c377ad7568b4bbc9b61496b66863da7211bb94: Status 404 returned error can't find the container with id 136b55d45fd09c7003c0da52f1c377ad7568b4bbc9b61496b66863da7211bb94 Apr 16 16:25:12.360753 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:12.360648 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d8bf96d8-da41-4d71-80d1-f04a83e90145-node-exporter-tls\") pod \"node-exporter-5x8g6\" (UID: \"d8bf96d8-da41-4d71-80d1-f04a83e90145\") " pod="openshift-monitoring/node-exporter-5x8g6" Apr 16 16:25:12.360944 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:12.360778 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/71243495-3c08-450a-b8d9-dce03ef8be95-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-4ch4v\" (UID: \"71243495-3c08-450a-b8d9-dce03ef8be95\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4ch4v" Apr 16 16:25:12.360944 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:25:12.360832 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 16:25:12.360944 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:25:12.360924 
2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8bf96d8-da41-4d71-80d1-f04a83e90145-node-exporter-tls podName:d8bf96d8-da41-4d71-80d1-f04a83e90145 nodeName:}" failed. No retries permitted until 2026-04-16 16:25:13.360903566 +0000 UTC m=+142.325684045 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/d8bf96d8-da41-4d71-80d1-f04a83e90145-node-exporter-tls") pod "node-exporter-5x8g6" (UID: "d8bf96d8-da41-4d71-80d1-f04a83e90145") : secret "node-exporter-tls" not found
Apr 16 16:25:12.363401 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:12.363374 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/71243495-3c08-450a-b8d9-dce03ef8be95-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-4ch4v\" (UID: \"71243495-3c08-450a-b8d9-dce03ef8be95\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4ch4v"
Apr 16 16:25:12.537362 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:12.537325 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-4ch4v"
Apr 16 16:25:12.687366 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:12.687337 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-4ch4v"]
Apr 16 16:25:12.689906 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:25:12.689877 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71243495_3c08_450a_b8d9_dce03ef8be95.slice/crio-d3aba26592a52fd7f2ab1082b958eca3cb71e331fcb459ca05075aa818d7a8f3 WatchSource:0}: Error finding container d3aba26592a52fd7f2ab1082b958eca3cb71e331fcb459ca05075aa818d7a8f3: Status 404 returned error can't find the container with id d3aba26592a52fd7f2ab1082b958eca3cb71e331fcb459ca05075aa818d7a8f3
Apr 16 16:25:13.035610 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:13.035565 2577 patch_prober.go:28] interesting pod/image-registry-777984ddb8-n9rkz container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 16:25:13.035800 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:13.035640 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" podUID="33bea546-d2f7-4497-87b9-43156b40e189" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 16:25:13.078617 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:13.078553 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-sx2kr" event={"ID":"c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb","Type":"ContainerStarted","Data":"136b55d45fd09c7003c0da52f1c377ad7568b4bbc9b61496b66863da7211bb94"}
Apr 16 16:25:13.080619 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:13.080582 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-4ch4v" event={"ID":"71243495-3c08-450a-b8d9-dce03ef8be95","Type":"ContainerStarted","Data":"373000b707cdec5e4cc349558eac28a4d2f47e035633fcad381467832207e181"}
Apr 16 16:25:13.080619 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:13.080618 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-4ch4v" event={"ID":"71243495-3c08-450a-b8d9-dce03ef8be95","Type":"ContainerStarted","Data":"24616f534cc5c37cfbb40e3fcf0bf7edf7b8fb9c65536bf06c6ba63967627d90"}
Apr 16 16:25:13.080619 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:13.080628 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-4ch4v" event={"ID":"71243495-3c08-450a-b8d9-dce03ef8be95","Type":"ContainerStarted","Data":"d3aba26592a52fd7f2ab1082b958eca3cb71e331fcb459ca05075aa818d7a8f3"}
Apr 16 16:25:13.369418 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:13.369319 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d8bf96d8-da41-4d71-80d1-f04a83e90145-node-exporter-tls\") pod \"node-exporter-5x8g6\" (UID: \"d8bf96d8-da41-4d71-80d1-f04a83e90145\") " pod="openshift-monitoring/node-exporter-5x8g6"
Apr 16 16:25:13.371984 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:13.371958 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d8bf96d8-da41-4d71-80d1-f04a83e90145-node-exporter-tls\") pod \"node-exporter-5x8g6\" (UID: \"d8bf96d8-da41-4d71-80d1-f04a83e90145\") " pod="openshift-monitoring/node-exporter-5x8g6"
Apr 16 16:25:13.443646 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:13.443607 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-5x8g6"
Apr 16 16:25:13.554973 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:25:13.554940 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8bf96d8_da41_4d71_80d1_f04a83e90145.slice/crio-c29d40cdb296c7c9b2e06ddca2f24cb1cd73c2e3c0ac882ff20c99d9addd27bc WatchSource:0}: Error finding container c29d40cdb296c7c9b2e06ddca2f24cb1cd73c2e3c0ac882ff20c99d9addd27bc: Status 404 returned error can't find the container with id c29d40cdb296c7c9b2e06ddca2f24cb1cd73c2e3c0ac882ff20c99d9addd27bc
Apr 16 16:25:14.087593 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:14.087549 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5x8g6" event={"ID":"d8bf96d8-da41-4d71-80d1-f04a83e90145","Type":"ContainerStarted","Data":"c29d40cdb296c7c9b2e06ddca2f24cb1cd73c2e3c0ac882ff20c99d9addd27bc"}
Apr 16 16:25:14.090376 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:14.090344 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-sx2kr" event={"ID":"c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb","Type":"ContainerStarted","Data":"912b405265533385de9daad0833e372954484e01cc3c75978be736d36b76f593"}
Apr 16 16:25:14.090551 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:14.090392 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-sx2kr" event={"ID":"c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb","Type":"ContainerStarted","Data":"47c29f48b8e2756f74313e4cb26c8fff4484343fdb0cfacf3dbd8eb4d2fe72d5"}
Apr 16 16:25:14.090551 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:14.090406 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-sx2kr" event={"ID":"c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb","Type":"ContainerStarted","Data":"4970a31ff7bb887a0b5c996a9a72c4b9a64d12522d435f9b884ffd321ce60363"}
Apr 16 16:25:14.121284 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:14.121222 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-sx2kr" podStartSLOduration=1.5829092569999998 podStartE2EDuration="3.121201215s" podCreationTimestamp="2026-04-16 16:25:11 +0000 UTC" firstStartedPulling="2026-04-16 16:25:12.116585617 +0000 UTC m=+141.081366095" lastFinishedPulling="2026-04-16 16:25:13.654877574 +0000 UTC m=+142.619658053" observedRunningTime="2026-04-16 16:25:14.119699915 +0000 UTC m=+143.084480412" watchObservedRunningTime="2026-04-16 16:25:14.121201215 +0000 UTC m=+143.085981713"
Apr 16 16:25:15.095775 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:15.095685 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-4ch4v" event={"ID":"71243495-3c08-450a-b8d9-dce03ef8be95","Type":"ContainerStarted","Data":"7498c6c09310e9a68be9385c5563de758476c710fff118cfe4be9b33270ee073"}
Apr 16 16:25:15.097358 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:15.097328 2577 generic.go:358] "Generic (PLEG): container finished" podID="d8bf96d8-da41-4d71-80d1-f04a83e90145" containerID="c3fd648cc671e8269c299280fa8239ac5581c88b9eee19cf67b0d9023a0678d5" exitCode=0
Apr 16 16:25:15.097530 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:15.097405 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5x8g6" event={"ID":"d8bf96d8-da41-4d71-80d1-f04a83e90145","Type":"ContainerDied","Data":"c3fd648cc671e8269c299280fa8239ac5581c88b9eee19cf67b0d9023a0678d5"}
Apr 16 16:25:15.117963 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:15.117923 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-4ch4v" podStartSLOduration=2.29607094 podStartE2EDuration="4.11790872s" podCreationTimestamp="2026-04-16 16:25:11 +0000 UTC" firstStartedPulling="2026-04-16 16:25:12.843370883 +0000 UTC m=+141.808151361" lastFinishedPulling="2026-04-16 16:25:14.665208666 +0000 UTC m=+143.629989141" observedRunningTime="2026-04-16 16:25:15.115635797 +0000 UTC m=+144.080416293" watchObservedRunningTime="2026-04-16 16:25:15.11790872 +0000 UTC m=+144.082689258"
Apr 16 16:25:16.102346 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:16.102310 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5x8g6" event={"ID":"d8bf96d8-da41-4d71-80d1-f04a83e90145","Type":"ContainerStarted","Data":"5adebc43a980c5b160757ede3410ce07db84b0584b3b76c5da374c29f7f12427"}
Apr 16 16:25:16.102346 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:16.102350 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5x8g6" event={"ID":"d8bf96d8-da41-4d71-80d1-f04a83e90145","Type":"ContainerStarted","Data":"d5bea59f3ccfb8670170cdae396f1f20f18aaa0db695c6f023d8f46a9cd546c5"}
Apr 16 16:25:16.131194 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:16.131133 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-5x8g6" podStartSLOduration=4.024496474 podStartE2EDuration="5.131114917s" podCreationTimestamp="2026-04-16 16:25:11 +0000 UTC" firstStartedPulling="2026-04-16 16:25:13.556695924 +0000 UTC m=+142.521476397" lastFinishedPulling="2026-04-16 16:25:14.663314353 +0000 UTC m=+143.628094840" observedRunningTime="2026-04-16 16:25:16.129129729 +0000 UTC m=+145.093910226" watchObservedRunningTime="2026-04-16 16:25:16.131114917 +0000 UTC m=+145.095895412"
Apr 16 16:25:18.004811 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.004773 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 16:25:18.010613 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.010585 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.016484 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.016424 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 16 16:25:18.016818 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.016652 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 16:25:18.016818 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.016663 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 16 16:25:18.016818 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.016693 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 16 16:25:18.016818 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.016735 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 16:25:18.016818 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.016652 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 16:25:18.017215 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.017024 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 16:25:18.017587 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.017565 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 16:25:18.017717 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.017593 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-pnl98\""
Apr 16 16:25:18.017717 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.017624 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 16:25:18.017717 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.017569 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 16:25:18.017717 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.017647 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-bjcmra9ed76oe\""
Apr 16 16:25:18.017921 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.017565 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 16:25:18.017921 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.017857 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 16 16:25:18.019632 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.019612 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 16 16:25:18.036583 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.036538 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 16:25:18.114508 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.114477 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.114683 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.114519 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.114683 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.114555 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.114683 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.114635 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/106aaf56-f922-4d25-baa9-402d7df5662e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.114683 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.114669 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/106aaf56-f922-4d25-baa9-402d7df5662e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.114879 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.114812 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-config\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.114879 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.114850 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/106aaf56-f922-4d25-baa9-402d7df5662e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.115024 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.114925 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/106aaf56-f922-4d25-baa9-402d7df5662e-config-out\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.115065 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.115047 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.115117 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.115089 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/106aaf56-f922-4d25-baa9-402d7df5662e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.115174 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.115153 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.115231 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.115208 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/106aaf56-f922-4d25-baa9-402d7df5662e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.115363 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.115247 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5frfv\" (UniqueName: \"kubernetes.io/projected/106aaf56-f922-4d25-baa9-402d7df5662e-kube-api-access-5frfv\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.115363 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.115324 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-web-config\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.115363 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.115357 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.115507 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.115382 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/106aaf56-f922-4d25-baa9-402d7df5662e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.115507 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.115411 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/106aaf56-f922-4d25-baa9-402d7df5662e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.115507 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.115454 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.216062 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.216016 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-web-config\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.216062 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.216057 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.216062 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.216074 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/106aaf56-f922-4d25-baa9-402d7df5662e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.216345 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.216094 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/106aaf56-f922-4d25-baa9-402d7df5662e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.216345 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.216119 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.216345 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.216163 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.216345 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.216195 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.216345 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.216230 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.216345 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.216260 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/106aaf56-f922-4d25-baa9-402d7df5662e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.216345 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.216282 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/106aaf56-f922-4d25-baa9-402d7df5662e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.216345 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.216303 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-config\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.216345 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.216327 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/106aaf56-f922-4d25-baa9-402d7df5662e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.216831 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.216364 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/106aaf56-f922-4d25-baa9-402d7df5662e-config-out\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.216831 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.216400 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.216831 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.216428 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/106aaf56-f922-4d25-baa9-402d7df5662e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.216831 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.216476 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.216831 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.216543 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/106aaf56-f922-4d25-baa9-402d7df5662e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.218707 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.217470 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/106aaf56-f922-4d25-baa9-402d7df5662e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.218707 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.217944 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/106aaf56-f922-4d25-baa9-402d7df5662e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.219156 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.219134 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/106aaf56-f922-4d25-baa9-402d7df5662e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.220203 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.219806 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/106aaf56-f922-4d25-baa9-402d7df5662e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.220203 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.219862 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5frfv\" (UniqueName: \"kubernetes.io/projected/106aaf56-f922-4d25-baa9-402d7df5662e-kube-api-access-5frfv\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.220907 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.220881 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/106aaf56-f922-4d25-baa9-402d7df5662e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.221197 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.221179 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/106aaf56-f922-4d25-baa9-402d7df5662e-config-out\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.221675 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.221657 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-web-config\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.222907 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.222883 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.222998 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.222904 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-config\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.222998 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.222917 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.223113 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.223042 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.223113 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.223054 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/106aaf56-f922-4d25-baa9-402d7df5662e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.223335 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.223311 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.223335 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.223325 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.223639 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.223623 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.224884 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.224866 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.225062 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.225043 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/106aaf56-f922-4d25-baa9-402d7df5662e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.229968 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.229947 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5frfv\" (UniqueName: \"kubernetes.io/projected/106aaf56-f922-4d25-baa9-402d7df5662e-kube-api-access-5frfv\") pod \"prometheus-k8s-0\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.323246 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.323161 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:18.466982 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:18.466919 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 16:25:18.471356 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:25:18.471318 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod106aaf56_f922_4d25_baa9_402d7df5662e.slice/crio-c102a9f80a74ff2bc6384273f6dedd3382ee8fa84c8a59b7f6e420b954717972 WatchSource:0}: Error finding container c102a9f80a74ff2bc6384273f6dedd3382ee8fa84c8a59b7f6e420b954717972: Status 404 returned error can't find the container with id c102a9f80a74ff2bc6384273f6dedd3382ee8fa84c8a59b7f6e420b954717972
Apr 16 16:25:19.119518 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:19.119479 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"106aaf56-f922-4d25-baa9-402d7df5662e","Type":"ContainerStarted","Data":"c102a9f80a74ff2bc6384273f6dedd3382ee8fa84c8a59b7f6e420b954717972"}
Apr 16 16:25:20.123699 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:20.123662 2577 generic.go:358] "Generic (PLEG): container finished" podID="106aaf56-f922-4d25-baa9-402d7df5662e" containerID="92c0cb84c06ba5ce76e5cfe0594f5172873fc35f285649a704a2cfa611551116" exitCode=0
Apr 16 16:25:20.124060 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:20.123747 2577 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"106aaf56-f922-4d25-baa9-402d7df5662e","Type":"ContainerDied","Data":"92c0cb84c06ba5ce76e5cfe0594f5172873fc35f285649a704a2cfa611551116"} Apr 16 16:25:23.036662 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:23.036626 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:25:24.137165 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:24.137132 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"106aaf56-f922-4d25-baa9-402d7df5662e","Type":"ContainerStarted","Data":"0b7570ef67bb921199f809794ea4bca042b4602b75819c8368bb1aeacc98aa94"} Apr 16 16:25:24.137165 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:24.137171 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"106aaf56-f922-4d25-baa9-402d7df5662e","Type":"ContainerStarted","Data":"3422557698fe15b676a75d865587e883b3a9ac3c5f404c0e6a75faf1af0713bd"} Apr 16 16:25:25.053741 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:25.053719 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7c6b454dd-p6wm8" Apr 16 16:25:25.143316 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:25.143290 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"106aaf56-f922-4d25-baa9-402d7df5662e","Type":"ContainerStarted","Data":"ce2badfc7dac5f16a636dd79e7991f111408f058a8bcdf2bc7ed355829dc608c"} Apr 16 16:25:25.143657 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:25.143323 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"106aaf56-f922-4d25-baa9-402d7df5662e","Type":"ContainerStarted","Data":"3afbc5bec2bfe249a7e4ea620bf1592f2998c8725344138c3d7eb9a265dd102c"} Apr 16 
16:25:26.148888 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:26.148848 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"106aaf56-f922-4d25-baa9-402d7df5662e","Type":"ContainerStarted","Data":"a7e4201f793072b944613794d4927d81322e3d0bc595e5220ed3c8d3758be6ec"} Apr 16 16:25:26.148888 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:26.148888 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"106aaf56-f922-4d25-baa9-402d7df5662e","Type":"ContainerStarted","Data":"5fe0db7a6a3ce18a071816b3f4cbd62f463b9d53ecb275bc796f5a8ef2ed003e"} Apr 16 16:25:26.184188 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:26.184138 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.707454412 podStartE2EDuration="9.184123755s" podCreationTimestamp="2026-04-16 16:25:17 +0000 UTC" firstStartedPulling="2026-04-16 16:25:18.474017456 +0000 UTC m=+147.438797930" lastFinishedPulling="2026-04-16 16:25:24.950686794 +0000 UTC m=+153.915467273" observedRunningTime="2026-04-16 16:25:26.181412581 +0000 UTC m=+155.146193090" watchObservedRunningTime="2026-04-16 16:25:26.184123755 +0000 UTC m=+155.148904250" Apr 16 16:25:27.917869 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:25:27.917832 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-pfv5k" podUID="8016a568-6fe9-4dfc-a543-f50b2768e5b2" Apr 16 16:25:27.933101 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:25:27.933073 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-z5t69" podUID="461b689e-a41b-4182-ba52-e26a1dfbc007" Apr 16 
16:25:28.054822 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:28.054763 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" podUID="33bea546-d2f7-4497-87b9-43156b40e189" containerName="registry" containerID="cri-o://3505418c19bd28ae1aa61f2f43d0329c0aed31a203471dc349e6ad7321abaf54" gracePeriod=30 Apr 16 16:25:28.154677 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:28.154652 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pfv5k" Apr 16 16:25:28.298885 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:28.298865 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:25:28.324017 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:28.323993 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:25:28.414242 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:28.414217 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/33bea546-d2f7-4497-87b9-43156b40e189-registry-certificates\") pod \"33bea546-d2f7-4497-87b9-43156b40e189\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " Apr 16 16:25:28.414393 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:28.414282 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7fwl\" (UniqueName: \"kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-kube-api-access-m7fwl\") pod \"33bea546-d2f7-4497-87b9-43156b40e189\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " Apr 16 16:25:28.414393 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:28.414310 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-registry-tls\") pod \"33bea546-d2f7-4497-87b9-43156b40e189\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " Apr 16 16:25:28.414543 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:28.414408 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/33bea546-d2f7-4497-87b9-43156b40e189-ca-trust-extracted\") pod \"33bea546-d2f7-4497-87b9-43156b40e189\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " Apr 16 16:25:28.414543 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:28.414505 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/33bea546-d2f7-4497-87b9-43156b40e189-image-registry-private-configuration\") pod \"33bea546-d2f7-4497-87b9-43156b40e189\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " Apr 16 16:25:28.414650 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:28.414565 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/33bea546-d2f7-4497-87b9-43156b40e189-trusted-ca\") pod \"33bea546-d2f7-4497-87b9-43156b40e189\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " Apr 16 16:25:28.414650 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:28.414588 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-bound-sa-token\") pod \"33bea546-d2f7-4497-87b9-43156b40e189\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " Apr 16 16:25:28.414752 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:28.414656 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/33bea546-d2f7-4497-87b9-43156b40e189-installation-pull-secrets\") pod \"33bea546-d2f7-4497-87b9-43156b40e189\" (UID: \"33bea546-d2f7-4497-87b9-43156b40e189\") " Apr 16 16:25:28.414752 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:28.414708 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33bea546-d2f7-4497-87b9-43156b40e189-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "33bea546-d2f7-4497-87b9-43156b40e189" (UID: "33bea546-d2f7-4497-87b9-43156b40e189"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:25:28.416845 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:28.416359 2577 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/33bea546-d2f7-4497-87b9-43156b40e189-registry-certificates\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:25:28.417367 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:28.417134 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33bea546-d2f7-4497-87b9-43156b40e189-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "33bea546-d2f7-4497-87b9-43156b40e189" (UID: "33bea546-d2f7-4497-87b9-43156b40e189"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:25:28.419273 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:28.419245 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "33bea546-d2f7-4497-87b9-43156b40e189" (UID: "33bea546-d2f7-4497-87b9-43156b40e189"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:25:28.419410 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:28.419355 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33bea546-d2f7-4497-87b9-43156b40e189-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "33bea546-d2f7-4497-87b9-43156b40e189" (UID: "33bea546-d2f7-4497-87b9-43156b40e189"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:25:28.419493 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:28.419417 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33bea546-d2f7-4497-87b9-43156b40e189-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "33bea546-d2f7-4497-87b9-43156b40e189" (UID: "33bea546-d2f7-4497-87b9-43156b40e189"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:25:28.419538 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:28.419503 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "33bea546-d2f7-4497-87b9-43156b40e189" (UID: "33bea546-d2f7-4497-87b9-43156b40e189"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:25:28.419747 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:28.419717 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-kube-api-access-m7fwl" (OuterVolumeSpecName: "kube-api-access-m7fwl") pod "33bea546-d2f7-4497-87b9-43156b40e189" (UID: "33bea546-d2f7-4497-87b9-43156b40e189"). InnerVolumeSpecName "kube-api-access-m7fwl". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:25:28.424130 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:28.424104 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33bea546-d2f7-4497-87b9-43156b40e189-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "33bea546-d2f7-4497-87b9-43156b40e189" (UID: "33bea546-d2f7-4497-87b9-43156b40e189"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:25:28.517239 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:28.517214 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m7fwl\" (UniqueName: \"kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-kube-api-access-m7fwl\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:25:28.517239 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:28.517238 2577 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-registry-tls\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:25:28.517368 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:28.517248 2577 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/33bea546-d2f7-4497-87b9-43156b40e189-ca-trust-extracted\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:25:28.517368 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:28.517259 2577 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/33bea546-d2f7-4497-87b9-43156b40e189-image-registry-private-configuration\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:25:28.517368 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:28.517270 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/33bea546-d2f7-4497-87b9-43156b40e189-trusted-ca\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:25:28.517368 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:28.517280 2577 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/33bea546-d2f7-4497-87b9-43156b40e189-bound-sa-token\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:25:28.517368 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:28.517288 2577 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/33bea546-d2f7-4497-87b9-43156b40e189-installation-pull-secrets\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:25:29.158141 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:29.158106 2577 generic.go:358] "Generic (PLEG): container finished" podID="33bea546-d2f7-4497-87b9-43156b40e189" containerID="3505418c19bd28ae1aa61f2f43d0329c0aed31a203471dc349e6ad7321abaf54" exitCode=0 Apr 16 16:25:29.158616 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:29.158197 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" Apr 16 16:25:29.158616 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:29.158199 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" event={"ID":"33bea546-d2f7-4497-87b9-43156b40e189","Type":"ContainerDied","Data":"3505418c19bd28ae1aa61f2f43d0329c0aed31a203471dc349e6ad7321abaf54"} Apr 16 16:25:29.158616 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:29.158240 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-777984ddb8-n9rkz" event={"ID":"33bea546-d2f7-4497-87b9-43156b40e189","Type":"ContainerDied","Data":"47d5c90fd1bd207d5c77b850b7a4d88ab42153ded2b7d41a54b345d64d439c14"} Apr 16 16:25:29.158616 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:29.158258 2577 scope.go:117] "RemoveContainer" containerID="3505418c19bd28ae1aa61f2f43d0329c0aed31a203471dc349e6ad7321abaf54" Apr 16 16:25:29.167687 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:29.167670 2577 scope.go:117] "RemoveContainer" containerID="3505418c19bd28ae1aa61f2f43d0329c0aed31a203471dc349e6ad7321abaf54" Apr 16 16:25:29.167945 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:25:29.167925 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3505418c19bd28ae1aa61f2f43d0329c0aed31a203471dc349e6ad7321abaf54\": container with ID starting with 3505418c19bd28ae1aa61f2f43d0329c0aed31a203471dc349e6ad7321abaf54 not found: ID does not exist" containerID="3505418c19bd28ae1aa61f2f43d0329c0aed31a203471dc349e6ad7321abaf54" Apr 16 16:25:29.168011 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:29.167956 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3505418c19bd28ae1aa61f2f43d0329c0aed31a203471dc349e6ad7321abaf54"} err="failed to get container status 
\"3505418c19bd28ae1aa61f2f43d0329c0aed31a203471dc349e6ad7321abaf54\": rpc error: code = NotFound desc = could not find container \"3505418c19bd28ae1aa61f2f43d0329c0aed31a203471dc349e6ad7321abaf54\": container with ID starting with 3505418c19bd28ae1aa61f2f43d0329c0aed31a203471dc349e6ad7321abaf54 not found: ID does not exist" Apr 16 16:25:29.184331 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:29.184310 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-777984ddb8-n9rkz"] Apr 16 16:25:29.186135 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:29.186092 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-777984ddb8-n9rkz"] Apr 16 16:25:29.650290 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:29.650201 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33bea546-d2f7-4497-87b9-43156b40e189" path="/var/lib/kubelet/pods/33bea546-d2f7-4497-87b9-43156b40e189/volumes" Apr 16 16:25:32.854086 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:32.854056 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8016a568-6fe9-4dfc-a543-f50b2768e5b2-metrics-tls\") pod \"dns-default-pfv5k\" (UID: \"8016a568-6fe9-4dfc-a543-f50b2768e5b2\") " pod="openshift-dns/dns-default-pfv5k" Apr 16 16:25:32.854522 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:32.854106 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/461b689e-a41b-4182-ba52-e26a1dfbc007-cert\") pod \"ingress-canary-z5t69\" (UID: \"461b689e-a41b-4182-ba52-e26a1dfbc007\") " pod="openshift-ingress-canary/ingress-canary-z5t69" Apr 16 16:25:32.856518 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:32.856496 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/8016a568-6fe9-4dfc-a543-f50b2768e5b2-metrics-tls\") pod \"dns-default-pfv5k\" (UID: \"8016a568-6fe9-4dfc-a543-f50b2768e5b2\") " pod="openshift-dns/dns-default-pfv5k" Apr 16 16:25:32.856633 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:32.856597 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/461b689e-a41b-4182-ba52-e26a1dfbc007-cert\") pod \"ingress-canary-z5t69\" (UID: \"461b689e-a41b-4182-ba52-e26a1dfbc007\") " pod="openshift-ingress-canary/ingress-canary-z5t69" Apr 16 16:25:32.959045 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:32.959012 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2bd8s\"" Apr 16 16:25:32.965416 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:32.965388 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pfv5k" Apr 16 16:25:33.107183 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:33.107157 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pfv5k"] Apr 16 16:25:33.111428 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:25:33.111399 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8016a568_6fe9_4dfc_a543_f50b2768e5b2.slice/crio-95cfb76695e48d9a6c4cec87807d9e7749668de7f1b4d0af744da020f6c85a30 WatchSource:0}: Error finding container 95cfb76695e48d9a6c4cec87807d9e7749668de7f1b4d0af744da020f6c85a30: Status 404 returned error can't find the container with id 95cfb76695e48d9a6c4cec87807d9e7749668de7f1b4d0af744da020f6c85a30 Apr 16 16:25:33.170899 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:33.170859 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pfv5k" 
event={"ID":"8016a568-6fe9-4dfc-a543-f50b2768e5b2","Type":"ContainerStarted","Data":"95cfb76695e48d9a6c4cec87807d9e7749668de7f1b4d0af744da020f6c85a30"} Apr 16 16:25:35.177642 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:35.177611 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pfv5k" event={"ID":"8016a568-6fe9-4dfc-a543-f50b2768e5b2","Type":"ContainerStarted","Data":"12493cc10a769c840bedc72fe6c26fe67049de5d46d34ea7142a010af8b9edc4"} Apr 16 16:25:35.177642 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:35.177645 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pfv5k" event={"ID":"8016a568-6fe9-4dfc-a543-f50b2768e5b2","Type":"ContainerStarted","Data":"11d44d0483d5f529d84ae9af21f02ffb8d73c6cdb917a5d6209d3b4b7555cc36"} Apr 16 16:25:35.178083 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:35.177715 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-pfv5k" Apr 16 16:25:35.204881 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:35.204831 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-pfv5k" podStartSLOduration=129.99538306 podStartE2EDuration="2m11.204815419s" podCreationTimestamp="2026-04-16 16:23:24 +0000 UTC" firstStartedPulling="2026-04-16 16:25:33.114315089 +0000 UTC m=+162.079095569" lastFinishedPulling="2026-04-16 16:25:34.323747454 +0000 UTC m=+163.288527928" observedRunningTime="2026-04-16 16:25:35.204519842 +0000 UTC m=+164.169300339" watchObservedRunningTime="2026-04-16 16:25:35.204815419 +0000 UTC m=+164.169595914" Apr 16 16:25:43.646791 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:43.646702 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-z5t69" Apr 16 16:25:43.652511 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:43.652487 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hrwds\"" Apr 16 16:25:43.657832 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:43.657804 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-z5t69" Apr 16 16:25:43.788039 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:43.788009 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-z5t69"] Apr 16 16:25:43.791281 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:25:43.791256 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod461b689e_a41b_4182_ba52_e26a1dfbc007.slice/crio-305013385f2c010f0ad9d38b4f7470eacf36f55b71c36032f282669303934335 WatchSource:0}: Error finding container 305013385f2c010f0ad9d38b4f7470eacf36f55b71c36032f282669303934335: Status 404 returned error can't find the container with id 305013385f2c010f0ad9d38b4f7470eacf36f55b71c36032f282669303934335 Apr 16 16:25:44.207091 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:44.207055 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-z5t69" event={"ID":"461b689e-a41b-4182-ba52-e26a1dfbc007","Type":"ContainerStarted","Data":"305013385f2c010f0ad9d38b4f7470eacf36f55b71c36032f282669303934335"} Apr 16 16:25:45.186846 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:45.186814 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-pfv5k" Apr 16 16:25:46.213250 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:46.213218 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-z5t69" 
event={"ID":"461b689e-a41b-4182-ba52-e26a1dfbc007","Type":"ContainerStarted","Data":"a4d134fd6f8b5a48c05fdb683ed0374861941bc8c092b492429bad58eafe2904"} Apr 16 16:25:46.236629 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:46.236574 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-z5t69" podStartSLOduration=140.633173689 podStartE2EDuration="2m22.236558449s" podCreationTimestamp="2026-04-16 16:23:24 +0000 UTC" firstStartedPulling="2026-04-16 16:25:43.793159581 +0000 UTC m=+172.757940059" lastFinishedPulling="2026-04-16 16:25:45.396544342 +0000 UTC m=+174.361324819" observedRunningTime="2026-04-16 16:25:46.234882053 +0000 UTC m=+175.199662549" watchObservedRunningTime="2026-04-16 16:25:46.236558449 +0000 UTC m=+175.201338945" Apr 16 16:25:56.243833 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:56.243797 2577 generic.go:358] "Generic (PLEG): container finished" podID="228b5774-6748-4592-bb81-0b7f69e4dcc8" containerID="791f2d762edf2afefc052a416b911d216db4376dfd488cc7f50d22e40493fa32" exitCode=0 Apr 16 16:25:56.244210 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:56.243857 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-txg78" event={"ID":"228b5774-6748-4592-bb81-0b7f69e4dcc8","Type":"ContainerDied","Data":"791f2d762edf2afefc052a416b911d216db4376dfd488cc7f50d22e40493fa32"} Apr 16 16:25:56.244210 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:56.244205 2577 scope.go:117] "RemoveContainer" containerID="791f2d762edf2afefc052a416b911d216db4376dfd488cc7f50d22e40493fa32" Apr 16 16:25:57.249289 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:25:57.249257 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-txg78" 
event={"ID":"228b5774-6748-4592-bb81-0b7f69e4dcc8","Type":"ContainerStarted","Data":"b7fc618241c092a643270180907823b5c34f05fe85ee83a62c924a6093f28ed5"}
Apr 16 16:26:00.261882 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:00.261847 2577 generic.go:358] "Generic (PLEG): container finished" podID="df5ba034-578f-423e-919b-afdf8297d467" containerID="4023430d0847a834a9e447a23ef7067986767487a2ef135fbf96950ad656b253" exitCode=0
Apr 16 16:26:00.262250 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:00.261897 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-f44hl" event={"ID":"df5ba034-578f-423e-919b-afdf8297d467","Type":"ContainerDied","Data":"4023430d0847a834a9e447a23ef7067986767487a2ef135fbf96950ad656b253"}
Apr 16 16:26:00.262250 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:00.262222 2577 scope.go:117] "RemoveContainer" containerID="4023430d0847a834a9e447a23ef7067986767487a2ef135fbf96950ad656b253"
Apr 16 16:26:01.267195 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:01.267161 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-f44hl" event={"ID":"df5ba034-578f-423e-919b-afdf8297d467","Type":"ContainerStarted","Data":"7090afdd9aa41b52c1f23698685267f089e281e1049532186fca3ea5f0b7a132"}
Apr 16 16:26:18.323831 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:18.323792 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:26:18.343185 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:18.343156 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:26:19.337385 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:19.337360 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:26:36.336957 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:36.336914 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 16:26:36.337630 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:36.337592 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="106aaf56-f922-4d25-baa9-402d7df5662e" containerName="prometheus" containerID="cri-o://3422557698fe15b676a75d865587e883b3a9ac3c5f404c0e6a75faf1af0713bd" gracePeriod=600
Apr 16 16:26:36.338009 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:36.337978 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="106aaf56-f922-4d25-baa9-402d7df5662e" containerName="kube-rbac-proxy-thanos" containerID="cri-o://a7e4201f793072b944613794d4927d81322e3d0bc595e5220ed3c8d3758be6ec" gracePeriod=600
Apr 16 16:26:36.338112 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:36.338077 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="106aaf56-f922-4d25-baa9-402d7df5662e" containerName="kube-rbac-proxy" containerID="cri-o://5fe0db7a6a3ce18a071816b3f4cbd62f463b9d53ecb275bc796f5a8ef2ed003e" gracePeriod=600
Apr 16 16:26:36.338174 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:36.338153 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="106aaf56-f922-4d25-baa9-402d7df5662e" containerName="kube-rbac-proxy-web" containerID="cri-o://ce2badfc7dac5f16a636dd79e7991f111408f058a8bcdf2bc7ed355829dc608c" gracePeriod=600
Apr 16 16:26:36.338241 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:36.338213 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="106aaf56-f922-4d25-baa9-402d7df5662e" containerName="thanos-sidecar" containerID="cri-o://3afbc5bec2bfe249a7e4ea620bf1592f2998c8725344138c3d7eb9a265dd102c" gracePeriod=600
Apr 16 16:26:36.338297 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:36.338270 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="106aaf56-f922-4d25-baa9-402d7df5662e" containerName="config-reloader" containerID="cri-o://0b7570ef67bb921199f809794ea4bca042b4602b75819c8368bb1aeacc98aa94" gracePeriod=600
Apr 16 16:26:37.385351 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.385319 2577 generic.go:358] "Generic (PLEG): container finished" podID="106aaf56-f922-4d25-baa9-402d7df5662e" containerID="a7e4201f793072b944613794d4927d81322e3d0bc595e5220ed3c8d3758be6ec" exitCode=0
Apr 16 16:26:37.385351 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.385343 2577 generic.go:358] "Generic (PLEG): container finished" podID="106aaf56-f922-4d25-baa9-402d7df5662e" containerID="5fe0db7a6a3ce18a071816b3f4cbd62f463b9d53ecb275bc796f5a8ef2ed003e" exitCode=0
Apr 16 16:26:37.385351 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.385351 2577 generic.go:358] "Generic (PLEG): container finished" podID="106aaf56-f922-4d25-baa9-402d7df5662e" containerID="3afbc5bec2bfe249a7e4ea620bf1592f2998c8725344138c3d7eb9a265dd102c" exitCode=0
Apr 16 16:26:37.385351 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.385357 2577 generic.go:358] "Generic (PLEG): container finished" podID="106aaf56-f922-4d25-baa9-402d7df5662e" containerID="0b7570ef67bb921199f809794ea4bca042b4602b75819c8368bb1aeacc98aa94" exitCode=0
Apr 16 16:26:37.385351 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.385362 2577 generic.go:358] "Generic (PLEG): container finished" podID="106aaf56-f922-4d25-baa9-402d7df5662e" containerID="3422557698fe15b676a75d865587e883b3a9ac3c5f404c0e6a75faf1af0713bd" exitCode=0
Apr 16 16:26:37.385861 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.385397 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"106aaf56-f922-4d25-baa9-402d7df5662e","Type":"ContainerDied","Data":"a7e4201f793072b944613794d4927d81322e3d0bc595e5220ed3c8d3758be6ec"}
Apr 16 16:26:37.385861 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.385433 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"106aaf56-f922-4d25-baa9-402d7df5662e","Type":"ContainerDied","Data":"5fe0db7a6a3ce18a071816b3f4cbd62f463b9d53ecb275bc796f5a8ef2ed003e"}
Apr 16 16:26:37.385861 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.385457 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"106aaf56-f922-4d25-baa9-402d7df5662e","Type":"ContainerDied","Data":"3afbc5bec2bfe249a7e4ea620bf1592f2998c8725344138c3d7eb9a265dd102c"}
Apr 16 16:26:37.385861 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.385467 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"106aaf56-f922-4d25-baa9-402d7df5662e","Type":"ContainerDied","Data":"0b7570ef67bb921199f809794ea4bca042b4602b75819c8368bb1aeacc98aa94"}
Apr 16 16:26:37.385861 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.385477 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"106aaf56-f922-4d25-baa9-402d7df5662e","Type":"ContainerDied","Data":"3422557698fe15b676a75d865587e883b3a9ac3c5f404c0e6a75faf1af0713bd"}
Apr 16 16:26:37.585074 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.585050 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:26:37.687948 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.687912 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-config\") pod \"106aaf56-f922-4d25-baa9-402d7df5662e\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") "
Apr 16 16:26:37.687948 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.687951 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/106aaf56-f922-4d25-baa9-402d7df5662e-configmap-serving-certs-ca-bundle\") pod \"106aaf56-f922-4d25-baa9-402d7df5662e\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") "
Apr 16 16:26:37.688215 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.687981 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-kube-rbac-proxy\") pod \"106aaf56-f922-4d25-baa9-402d7df5662e\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") "
Apr 16 16:26:37.688215 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.688000 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-prometheus-k8s-tls\") pod \"106aaf56-f922-4d25-baa9-402d7df5662e\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") "
Apr 16 16:26:37.688215 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.688025 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/106aaf56-f922-4d25-baa9-402d7df5662e-prometheus-k8s-db\") pod \"106aaf56-f922-4d25-baa9-402d7df5662e\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") "
Apr 16 16:26:37.688215 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.688049 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-metrics-client-certs\") pod \"106aaf56-f922-4d25-baa9-402d7df5662e\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") "
Apr 16 16:26:37.688215 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.688063 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/106aaf56-f922-4d25-baa9-402d7df5662e-prometheus-k8s-rulefiles-0\") pod \"106aaf56-f922-4d25-baa9-402d7df5662e\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") "
Apr 16 16:26:37.688215 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.688085 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/106aaf56-f922-4d25-baa9-402d7df5662e-tls-assets\") pod \"106aaf56-f922-4d25-baa9-402d7df5662e\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") "
Apr 16 16:26:37.688215 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.688109 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/106aaf56-f922-4d25-baa9-402d7df5662e-configmap-kubelet-serving-ca-bundle\") pod \"106aaf56-f922-4d25-baa9-402d7df5662e\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") "
Apr 16 16:26:37.688215 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.688141 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-grpc-tls\") pod \"106aaf56-f922-4d25-baa9-402d7df5662e\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") "
Apr 16 16:26:37.688215 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.688167 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/106aaf56-f922-4d25-baa9-402d7df5662e-config-out\") pod \"106aaf56-f922-4d25-baa9-402d7df5662e\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") "
Apr 16 16:26:37.688215 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.688195 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/106aaf56-f922-4d25-baa9-402d7df5662e-prometheus-trusted-ca-bundle\") pod \"106aaf56-f922-4d25-baa9-402d7df5662e\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") "
Apr 16 16:26:37.688744 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.688239 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-thanos-prometheus-http-client-file\") pod \"106aaf56-f922-4d25-baa9-402d7df5662e\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") "
Apr 16 16:26:37.688744 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.688269 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-web-config\") pod \"106aaf56-f922-4d25-baa9-402d7df5662e\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") "
Apr 16 16:26:37.688744 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.688302 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"106aaf56-f922-4d25-baa9-402d7df5662e\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") "
Apr 16 16:26:37.688744 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.688332 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/106aaf56-f922-4d25-baa9-402d7df5662e-configmap-metrics-client-ca\") pod \"106aaf56-f922-4d25-baa9-402d7df5662e\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") "
Apr 16 16:26:37.688744 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.688357 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5frfv\" (UniqueName: \"kubernetes.io/projected/106aaf56-f922-4d25-baa9-402d7df5662e-kube-api-access-5frfv\") pod \"106aaf56-f922-4d25-baa9-402d7df5662e\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") "
Apr 16 16:26:37.688744 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.688392 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"106aaf56-f922-4d25-baa9-402d7df5662e\" (UID: \"106aaf56-f922-4d25-baa9-402d7df5662e\") "
Apr 16 16:26:37.688744 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.688604 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/106aaf56-f922-4d25-baa9-402d7df5662e-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "106aaf56-f922-4d25-baa9-402d7df5662e" (UID: "106aaf56-f922-4d25-baa9-402d7df5662e"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:26:37.688744 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.688658 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/106aaf56-f922-4d25-baa9-402d7df5662e-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "106aaf56-f922-4d25-baa9-402d7df5662e" (UID: "106aaf56-f922-4d25-baa9-402d7df5662e"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:26:37.689129 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.688752 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/106aaf56-f922-4d25-baa9-402d7df5662e-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:26:37.689129 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.688770 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/106aaf56-f922-4d25-baa9-402d7df5662e-prometheus-trusted-ca-bundle\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:26:37.689807 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.689510 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/106aaf56-f922-4d25-baa9-402d7df5662e-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "106aaf56-f922-4d25-baa9-402d7df5662e" (UID: "106aaf56-f922-4d25-baa9-402d7df5662e"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:26:37.690348 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.690087 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/106aaf56-f922-4d25-baa9-402d7df5662e-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "106aaf56-f922-4d25-baa9-402d7df5662e" (UID: "106aaf56-f922-4d25-baa9-402d7df5662e"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:26:37.690348 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.690095 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/106aaf56-f922-4d25-baa9-402d7df5662e-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "106aaf56-f922-4d25-baa9-402d7df5662e" (UID: "106aaf56-f922-4d25-baa9-402d7df5662e"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:26:37.691436 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.691335 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "106aaf56-f922-4d25-baa9-402d7df5662e" (UID: "106aaf56-f922-4d25-baa9-402d7df5662e"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:26:37.691436 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.691374 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "106aaf56-f922-4d25-baa9-402d7df5662e" (UID: "106aaf56-f922-4d25-baa9-402d7df5662e"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:26:37.691436 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.691401 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "106aaf56-f922-4d25-baa9-402d7df5662e" (UID: "106aaf56-f922-4d25-baa9-402d7df5662e"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:26:37.691653 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.691496 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "106aaf56-f922-4d25-baa9-402d7df5662e" (UID: "106aaf56-f922-4d25-baa9-402d7df5662e"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:26:37.691890 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.691861 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/106aaf56-f922-4d25-baa9-402d7df5662e-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "106aaf56-f922-4d25-baa9-402d7df5662e" (UID: "106aaf56-f922-4d25-baa9-402d7df5662e"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:26:37.692006 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.691953 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "106aaf56-f922-4d25-baa9-402d7df5662e" (UID: "106aaf56-f922-4d25-baa9-402d7df5662e"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:26:37.692316 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.692276 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-config" (OuterVolumeSpecName: "config") pod "106aaf56-f922-4d25-baa9-402d7df5662e" (UID: "106aaf56-f922-4d25-baa9-402d7df5662e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:26:37.693199 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.693153 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "106aaf56-f922-4d25-baa9-402d7df5662e" (UID: "106aaf56-f922-4d25-baa9-402d7df5662e"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:26:37.693470 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.693417 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/106aaf56-f922-4d25-baa9-402d7df5662e-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "106aaf56-f922-4d25-baa9-402d7df5662e" (UID: "106aaf56-f922-4d25-baa9-402d7df5662e"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:26:37.693470 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.693431 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/106aaf56-f922-4d25-baa9-402d7df5662e-kube-api-access-5frfv" (OuterVolumeSpecName: "kube-api-access-5frfv") pod "106aaf56-f922-4d25-baa9-402d7df5662e" (UID: "106aaf56-f922-4d25-baa9-402d7df5662e"). InnerVolumeSpecName "kube-api-access-5frfv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:26:37.693615 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.693515 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/106aaf56-f922-4d25-baa9-402d7df5662e-config-out" (OuterVolumeSpecName: "config-out") pod "106aaf56-f922-4d25-baa9-402d7df5662e" (UID: "106aaf56-f922-4d25-baa9-402d7df5662e"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:26:37.694343 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.694324 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "106aaf56-f922-4d25-baa9-402d7df5662e" (UID: "106aaf56-f922-4d25-baa9-402d7df5662e"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:26:37.704705 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.704531 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-web-config" (OuterVolumeSpecName: "web-config") pod "106aaf56-f922-4d25-baa9-402d7df5662e" (UID: "106aaf56-f922-4d25-baa9-402d7df5662e"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:26:37.789997 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.789968 2577 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-grpc-tls\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:26:37.789997 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.789993 2577 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/106aaf56-f922-4d25-baa9-402d7df5662e-config-out\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:26:37.789997 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.790003 2577 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-thanos-prometheus-http-client-file\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:26:37.790212 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.790013 2577 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-web-config\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:26:37.790212 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.790023 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:26:37.790212 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.790032 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/106aaf56-f922-4d25-baa9-402d7df5662e-configmap-metrics-client-ca\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:26:37.790212 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.790043 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5frfv\" (UniqueName: \"kubernetes.io/projected/106aaf56-f922-4d25-baa9-402d7df5662e-kube-api-access-5frfv\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:26:37.790212 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.790052 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:26:37.790212 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.790061 2577 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-config\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:26:37.790212 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.790073 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/106aaf56-f922-4d25-baa9-402d7df5662e-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:26:37.790212 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.790083 2577 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-kube-rbac-proxy\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:26:37.790212 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.790091 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-prometheus-k8s-tls\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:26:37.790212 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.790100 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/106aaf56-f922-4d25-baa9-402d7df5662e-prometheus-k8s-db\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:26:37.790212 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.790109 2577 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/106aaf56-f922-4d25-baa9-402d7df5662e-secret-metrics-client-certs\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:26:37.790212 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.790118 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/106aaf56-f922-4d25-baa9-402d7df5662e-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:26:37.790212 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:37.790126 2577 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/106aaf56-f922-4d25-baa9-402d7df5662e-tls-assets\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:26:38.394111 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.394066 2577 generic.go:358] "Generic (PLEG): container finished" podID="106aaf56-f922-4d25-baa9-402d7df5662e" containerID="ce2badfc7dac5f16a636dd79e7991f111408f058a8bcdf2bc7ed355829dc608c" exitCode=0
Apr 16 16:26:38.394601 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.394145 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"106aaf56-f922-4d25-baa9-402d7df5662e","Type":"ContainerDied","Data":"ce2badfc7dac5f16a636dd79e7991f111408f058a8bcdf2bc7ed355829dc608c"}
Apr 16 16:26:38.394601 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.394190 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"106aaf56-f922-4d25-baa9-402d7df5662e","Type":"ContainerDied","Data":"c102a9f80a74ff2bc6384273f6dedd3382ee8fa84c8a59b7f6e420b954717972"}
Apr 16 16:26:38.394601 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.394198 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:26:38.394601 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.394220 2577 scope.go:117] "RemoveContainer" containerID="a7e4201f793072b944613794d4927d81322e3d0bc595e5220ed3c8d3758be6ec"
Apr 16 16:26:38.406830 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.406796 2577 scope.go:117] "RemoveContainer" containerID="5fe0db7a6a3ce18a071816b3f4cbd62f463b9d53ecb275bc796f5a8ef2ed003e"
Apr 16 16:26:38.416044 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.416021 2577 scope.go:117] "RemoveContainer" containerID="ce2badfc7dac5f16a636dd79e7991f111408f058a8bcdf2bc7ed355829dc608c"
Apr 16 16:26:38.423330 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.423308 2577 scope.go:117] "RemoveContainer" containerID="3afbc5bec2bfe249a7e4ea620bf1592f2998c8725344138c3d7eb9a265dd102c"
Apr 16 16:26:38.430565 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.430530 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 16:26:38.431206 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.431191 2577 scope.go:117] "RemoveContainer" containerID="0b7570ef67bb921199f809794ea4bca042b4602b75819c8368bb1aeacc98aa94"
Apr 16 16:26:38.435709 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.435680 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 16:26:38.440160 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.440140 2577 scope.go:117] "RemoveContainer" containerID="3422557698fe15b676a75d865587e883b3a9ac3c5f404c0e6a75faf1af0713bd"
Apr 16 16:26:38.447785 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.447765 2577 scope.go:117] "RemoveContainer" containerID="92c0cb84c06ba5ce76e5cfe0594f5172873fc35f285649a704a2cfa611551116"
Apr 16 16:26:38.455187 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.455169 2577 scope.go:117] "RemoveContainer" containerID="a7e4201f793072b944613794d4927d81322e3d0bc595e5220ed3c8d3758be6ec"
Apr 16 16:26:38.455513 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:26:38.455495 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7e4201f793072b944613794d4927d81322e3d0bc595e5220ed3c8d3758be6ec\": container with ID starting with a7e4201f793072b944613794d4927d81322e3d0bc595e5220ed3c8d3758be6ec not found: ID does not exist" containerID="a7e4201f793072b944613794d4927d81322e3d0bc595e5220ed3c8d3758be6ec"
Apr 16 16:26:38.455567 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.455522 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e4201f793072b944613794d4927d81322e3d0bc595e5220ed3c8d3758be6ec"} err="failed to get container status \"a7e4201f793072b944613794d4927d81322e3d0bc595e5220ed3c8d3758be6ec\": rpc error: code = NotFound desc = could not find container \"a7e4201f793072b944613794d4927d81322e3d0bc595e5220ed3c8d3758be6ec\": container with ID starting with a7e4201f793072b944613794d4927d81322e3d0bc595e5220ed3c8d3758be6ec not found: ID does not exist"
Apr 16 16:26:38.455567 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.455540 2577 scope.go:117] "RemoveContainer" containerID="5fe0db7a6a3ce18a071816b3f4cbd62f463b9d53ecb275bc796f5a8ef2ed003e"
Apr 16 16:26:38.455788 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:26:38.455769 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fe0db7a6a3ce18a071816b3f4cbd62f463b9d53ecb275bc796f5a8ef2ed003e\": container with ID starting with 5fe0db7a6a3ce18a071816b3f4cbd62f463b9d53ecb275bc796f5a8ef2ed003e not found: ID does not exist" containerID="5fe0db7a6a3ce18a071816b3f4cbd62f463b9d53ecb275bc796f5a8ef2ed003e"
Apr 16 16:26:38.455828 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.455797 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fe0db7a6a3ce18a071816b3f4cbd62f463b9d53ecb275bc796f5a8ef2ed003e"} err="failed to get container status \"5fe0db7a6a3ce18a071816b3f4cbd62f463b9d53ecb275bc796f5a8ef2ed003e\": rpc error: code = NotFound desc = could not find container \"5fe0db7a6a3ce18a071816b3f4cbd62f463b9d53ecb275bc796f5a8ef2ed003e\": container with ID starting with 5fe0db7a6a3ce18a071816b3f4cbd62f463b9d53ecb275bc796f5a8ef2ed003e not found: ID does not exist"
Apr 16 16:26:38.455828 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.455814 2577 scope.go:117] "RemoveContainer" containerID="ce2badfc7dac5f16a636dd79e7991f111408f058a8bcdf2bc7ed355829dc608c"
Apr 16 16:26:38.456047 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:26:38.456032 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce2badfc7dac5f16a636dd79e7991f111408f058a8bcdf2bc7ed355829dc608c\": container with ID starting with ce2badfc7dac5f16a636dd79e7991f111408f058a8bcdf2bc7ed355829dc608c not found: ID does not exist" containerID="ce2badfc7dac5f16a636dd79e7991f111408f058a8bcdf2bc7ed355829dc608c"
Apr 16 16:26:38.456100 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.456050 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce2badfc7dac5f16a636dd79e7991f111408f058a8bcdf2bc7ed355829dc608c"} err="failed to get container status \"ce2badfc7dac5f16a636dd79e7991f111408f058a8bcdf2bc7ed355829dc608c\": rpc error: code = NotFound desc = could not find container \"ce2badfc7dac5f16a636dd79e7991f111408f058a8bcdf2bc7ed355829dc608c\": container with ID starting with ce2badfc7dac5f16a636dd79e7991f111408f058a8bcdf2bc7ed355829dc608c not found: ID does not exist"
Apr 16 16:26:38.456100 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.456065 2577 scope.go:117] "RemoveContainer" containerID="3afbc5bec2bfe249a7e4ea620bf1592f2998c8725344138c3d7eb9a265dd102c"
Apr 16 16:26:38.456303 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:26:38.456286 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3afbc5bec2bfe249a7e4ea620bf1592f2998c8725344138c3d7eb9a265dd102c\": container with ID starting with 3afbc5bec2bfe249a7e4ea620bf1592f2998c8725344138c3d7eb9a265dd102c not found: ID does not exist" containerID="3afbc5bec2bfe249a7e4ea620bf1592f2998c8725344138c3d7eb9a265dd102c"
Apr 16 16:26:38.456344 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.456308 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3afbc5bec2bfe249a7e4ea620bf1592f2998c8725344138c3d7eb9a265dd102c"} err="failed to get container status \"3afbc5bec2bfe249a7e4ea620bf1592f2998c8725344138c3d7eb9a265dd102c\": rpc error: code = NotFound desc = could not find container \"3afbc5bec2bfe249a7e4ea620bf1592f2998c8725344138c3d7eb9a265dd102c\": container with ID starting with 3afbc5bec2bfe249a7e4ea620bf1592f2998c8725344138c3d7eb9a265dd102c not found: ID does not exist"
Apr 16 16:26:38.456344 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.456322 2577 scope.go:117] "RemoveContainer" containerID="0b7570ef67bb921199f809794ea4bca042b4602b75819c8368bb1aeacc98aa94"
Apr 16 16:26:38.456606 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:26:38.456587 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b7570ef67bb921199f809794ea4bca042b4602b75819c8368bb1aeacc98aa94\": container with ID starting with
0b7570ef67bb921199f809794ea4bca042b4602b75819c8368bb1aeacc98aa94 not found: ID does not exist" containerID="0b7570ef67bb921199f809794ea4bca042b4602b75819c8368bb1aeacc98aa94" Apr 16 16:26:38.456661 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.456611 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b7570ef67bb921199f809794ea4bca042b4602b75819c8368bb1aeacc98aa94"} err="failed to get container status \"0b7570ef67bb921199f809794ea4bca042b4602b75819c8368bb1aeacc98aa94\": rpc error: code = NotFound desc = could not find container \"0b7570ef67bb921199f809794ea4bca042b4602b75819c8368bb1aeacc98aa94\": container with ID starting with 0b7570ef67bb921199f809794ea4bca042b4602b75819c8368bb1aeacc98aa94 not found: ID does not exist" Apr 16 16:26:38.456661 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.456635 2577 scope.go:117] "RemoveContainer" containerID="3422557698fe15b676a75d865587e883b3a9ac3c5f404c0e6a75faf1af0713bd" Apr 16 16:26:38.456849 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:26:38.456835 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3422557698fe15b676a75d865587e883b3a9ac3c5f404c0e6a75faf1af0713bd\": container with ID starting with 3422557698fe15b676a75d865587e883b3a9ac3c5f404c0e6a75faf1af0713bd not found: ID does not exist" containerID="3422557698fe15b676a75d865587e883b3a9ac3c5f404c0e6a75faf1af0713bd" Apr 16 16:26:38.456892 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.456852 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3422557698fe15b676a75d865587e883b3a9ac3c5f404c0e6a75faf1af0713bd"} err="failed to get container status \"3422557698fe15b676a75d865587e883b3a9ac3c5f404c0e6a75faf1af0713bd\": rpc error: code = NotFound desc = could not find container \"3422557698fe15b676a75d865587e883b3a9ac3c5f404c0e6a75faf1af0713bd\": container with ID starting with 
3422557698fe15b676a75d865587e883b3a9ac3c5f404c0e6a75faf1af0713bd not found: ID does not exist" Apr 16 16:26:38.456892 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.456864 2577 scope.go:117] "RemoveContainer" containerID="92c0cb84c06ba5ce76e5cfe0594f5172873fc35f285649a704a2cfa611551116" Apr 16 16:26:38.457048 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:26:38.457034 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92c0cb84c06ba5ce76e5cfe0594f5172873fc35f285649a704a2cfa611551116\": container with ID starting with 92c0cb84c06ba5ce76e5cfe0594f5172873fc35f285649a704a2cfa611551116 not found: ID does not exist" containerID="92c0cb84c06ba5ce76e5cfe0594f5172873fc35f285649a704a2cfa611551116" Apr 16 16:26:38.457089 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.457050 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92c0cb84c06ba5ce76e5cfe0594f5172873fc35f285649a704a2cfa611551116"} err="failed to get container status \"92c0cb84c06ba5ce76e5cfe0594f5172873fc35f285649a704a2cfa611551116\": rpc error: code = NotFound desc = could not find container \"92c0cb84c06ba5ce76e5cfe0594f5172873fc35f285649a704a2cfa611551116\": container with ID starting with 92c0cb84c06ba5ce76e5cfe0594f5172873fc35f285649a704a2cfa611551116 not found: ID does not exist" Apr 16 16:26:38.460597 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.460568 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 16:26:38.460909 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.460895 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="106aaf56-f922-4d25-baa9-402d7df5662e" containerName="prometheus" Apr 16 16:26:38.460909 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.460909 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="106aaf56-f922-4d25-baa9-402d7df5662e" 
containerName="prometheus" Apr 16 16:26:38.461031 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.460921 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="106aaf56-f922-4d25-baa9-402d7df5662e" containerName="thanos-sidecar" Apr 16 16:26:38.461031 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.460929 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="106aaf56-f922-4d25-baa9-402d7df5662e" containerName="thanos-sidecar" Apr 16 16:26:38.461031 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.460941 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="106aaf56-f922-4d25-baa9-402d7df5662e" containerName="kube-rbac-proxy" Apr 16 16:26:38.461031 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.460947 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="106aaf56-f922-4d25-baa9-402d7df5662e" containerName="kube-rbac-proxy" Apr 16 16:26:38.461031 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.460954 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33bea546-d2f7-4497-87b9-43156b40e189" containerName="registry" Apr 16 16:26:38.461031 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.460959 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="33bea546-d2f7-4497-87b9-43156b40e189" containerName="registry" Apr 16 16:26:38.461031 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.460968 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="106aaf56-f922-4d25-baa9-402d7df5662e" containerName="config-reloader" Apr 16 16:26:38.461031 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.460973 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="106aaf56-f922-4d25-baa9-402d7df5662e" containerName="config-reloader" Apr 16 16:26:38.461031 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.460979 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="106aaf56-f922-4d25-baa9-402d7df5662e" containerName="kube-rbac-proxy-thanos" Apr 16 16:26:38.461031 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.460984 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="106aaf56-f922-4d25-baa9-402d7df5662e" containerName="kube-rbac-proxy-thanos" Apr 16 16:26:38.461031 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.460994 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="106aaf56-f922-4d25-baa9-402d7df5662e" containerName="kube-rbac-proxy-web" Apr 16 16:26:38.461031 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.460999 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="106aaf56-f922-4d25-baa9-402d7df5662e" containerName="kube-rbac-proxy-web" Apr 16 16:26:38.461031 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.461005 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="106aaf56-f922-4d25-baa9-402d7df5662e" containerName="init-config-reloader" Apr 16 16:26:38.461031 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.461010 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="106aaf56-f922-4d25-baa9-402d7df5662e" containerName="init-config-reloader" Apr 16 16:26:38.461544 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.461058 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="106aaf56-f922-4d25-baa9-402d7df5662e" containerName="prometheus" Apr 16 16:26:38.461544 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.461068 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="33bea546-d2f7-4497-87b9-43156b40e189" containerName="registry" Apr 16 16:26:38.461544 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.461074 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="106aaf56-f922-4d25-baa9-402d7df5662e" containerName="config-reloader" Apr 16 16:26:38.461544 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.461081 2577 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="106aaf56-f922-4d25-baa9-402d7df5662e" containerName="thanos-sidecar" Apr 16 16:26:38.461544 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.461087 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="106aaf56-f922-4d25-baa9-402d7df5662e" containerName="kube-rbac-proxy" Apr 16 16:26:38.461544 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.461094 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="106aaf56-f922-4d25-baa9-402d7df5662e" containerName="kube-rbac-proxy-web" Apr 16 16:26:38.461544 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.461101 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="106aaf56-f922-4d25-baa9-402d7df5662e" containerName="kube-rbac-proxy-thanos" Apr 16 16:26:38.466592 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.466569 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.472857 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.472828 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 16:26:38.473016 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.472938 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 16:26:38.473322 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.473137 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 16:26:38.474436 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.474415 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 16:26:38.474580 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.474519 2577 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 16:26:38.475054 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.474916 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 16:26:38.475054 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.474927 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 16:26:38.475054 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.474943 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-pnl98\"" Apr 16 16:26:38.475054 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.474979 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 16:26:38.475054 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.475021 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-bjcmra9ed76oe\"" Apr 16 16:26:38.477943 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.475495 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 16:26:38.477943 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.476114 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 16:26:38.477943 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.476327 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 16:26:38.479142 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.479117 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 16:26:38.481479 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.481439 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 16:26:38.481588 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.481570 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 16:26:38.596643 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.596597 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/462c2651-1d30-4908-9371-dc7b66a64e53-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.596643 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.596647 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/462c2651-1d30-4908-9371-dc7b66a64e53-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.596879 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.596675 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kqvh\" (UniqueName: \"kubernetes.io/projected/462c2651-1d30-4908-9371-dc7b66a64e53-kube-api-access-6kqvh\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.596879 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.596703 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" 
(UniqueName: \"kubernetes.io/empty-dir/462c2651-1d30-4908-9371-dc7b66a64e53-config-out\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.596879 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.596729 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/462c2651-1d30-4908-9371-dc7b66a64e53-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.596879 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.596808 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/462c2651-1d30-4908-9371-dc7b66a64e53-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.596879 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.596844 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/462c2651-1d30-4908-9371-dc7b66a64e53-web-config\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.596879 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.596862 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/462c2651-1d30-4908-9371-dc7b66a64e53-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.596879 
ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.596879 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/462c2651-1d30-4908-9371-dc7b66a64e53-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.597132 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.596902 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/462c2651-1d30-4908-9371-dc7b66a64e53-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.597132 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.596919 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/462c2651-1d30-4908-9371-dc7b66a64e53-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.597132 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.596979 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/462c2651-1d30-4908-9371-dc7b66a64e53-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.597132 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.597006 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/462c2651-1d30-4908-9371-dc7b66a64e53-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" 
(UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.597132 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.597043 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/462c2651-1d30-4908-9371-dc7b66a64e53-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.597132 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.597059 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/462c2651-1d30-4908-9371-dc7b66a64e53-config\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.597132 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.597084 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/462c2651-1d30-4908-9371-dc7b66a64e53-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.597132 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.597104 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/462c2651-1d30-4908-9371-dc7b66a64e53-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.597132 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.597137 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/462c2651-1d30-4908-9371-dc7b66a64e53-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.698413 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.698377 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/462c2651-1d30-4908-9371-dc7b66a64e53-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.698413 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.698418 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/462c2651-1d30-4908-9371-dc7b66a64e53-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.698667 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.698437 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/462c2651-1d30-4908-9371-dc7b66a64e53-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.698667 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.698468 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/462c2651-1d30-4908-9371-dc7b66a64e53-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
16:26:38.698667 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.698483 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6kqvh\" (UniqueName: \"kubernetes.io/projected/462c2651-1d30-4908-9371-dc7b66a64e53-kube-api-access-6kqvh\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.698667 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.698516 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/462c2651-1d30-4908-9371-dc7b66a64e53-config-out\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.698667 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.698548 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/462c2651-1d30-4908-9371-dc7b66a64e53-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.698667 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.698589 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/462c2651-1d30-4908-9371-dc7b66a64e53-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.698667 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.698610 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/462c2651-1d30-4908-9371-dc7b66a64e53-web-config\") pod \"prometheus-k8s-0\" (UID: 
\"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.698667 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.698637 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/462c2651-1d30-4908-9371-dc7b66a64e53-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.698667 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.698660 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/462c2651-1d30-4908-9371-dc7b66a64e53-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.699061 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.698690 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/462c2651-1d30-4908-9371-dc7b66a64e53-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.699061 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.698713 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/462c2651-1d30-4908-9371-dc7b66a64e53-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.699061 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.698743 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/462c2651-1d30-4908-9371-dc7b66a64e53-secret-metrics-client-certs\") 
pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.699061 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.698775 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/462c2651-1d30-4908-9371-dc7b66a64e53-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.699061 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.698828 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/462c2651-1d30-4908-9371-dc7b66a64e53-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.699061 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.698853 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/462c2651-1d30-4908-9371-dc7b66a64e53-config\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.699061 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.698895 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/462c2651-1d30-4908-9371-dc7b66a64e53-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.699406 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.699280 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/462c2651-1d30-4908-9371-dc7b66a64e53-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.699406 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.699289 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/462c2651-1d30-4908-9371-dc7b66a64e53-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.699406 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.699284 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/462c2651-1d30-4908-9371-dc7b66a64e53-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.701709 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.701679 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/462c2651-1d30-4908-9371-dc7b66a64e53-config-out\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.701907 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.701879 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/462c2651-1d30-4908-9371-dc7b66a64e53-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.701979 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.701919 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/462c2651-1d30-4908-9371-dc7b66a64e53-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.702042 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.701978 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/462c2651-1d30-4908-9371-dc7b66a64e53-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.702042 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.702006 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/462c2651-1d30-4908-9371-dc7b66a64e53-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.702873 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.702624 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/462c2651-1d30-4908-9371-dc7b66a64e53-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.702873 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.702782 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/462c2651-1d30-4908-9371-dc7b66a64e53-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.703046 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.702888 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/462c2651-1d30-4908-9371-dc7b66a64e53-web-config\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.703046 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.702943 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/462c2651-1d30-4908-9371-dc7b66a64e53-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.703157 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.703132 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/462c2651-1d30-4908-9371-dc7b66a64e53-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.704184 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.704153 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/462c2651-1d30-4908-9371-dc7b66a64e53-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.704426 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.704403 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/462c2651-1d30-4908-9371-dc7b66a64e53-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.705050 ip-10-0-130-165 
kubenswrapper[2577]: I0416 16:26:38.705026 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/462c2651-1d30-4908-9371-dc7b66a64e53-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.705351 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.705334 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/462c2651-1d30-4908-9371-dc7b66a64e53-config\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.706835 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.706813 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kqvh\" (UniqueName: \"kubernetes.io/projected/462c2651-1d30-4908-9371-dc7b66a64e53-kube-api-access-6kqvh\") pod \"prometheus-k8s-0\" (UID: \"462c2651-1d30-4908-9371-dc7b66a64e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.777764 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.777725 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:38.911236 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:38.911210 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 16:26:38.914238 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:26:38.914208 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod462c2651_1d30_4908_9371_dc7b66a64e53.slice/crio-b1e234062fb96ee2d07efae801b2660c048c4be17897f584854685e03b2bdb22 WatchSource:0}: Error finding container b1e234062fb96ee2d07efae801b2660c048c4be17897f584854685e03b2bdb22: Status 404 returned error can't find the container with id b1e234062fb96ee2d07efae801b2660c048c4be17897f584854685e03b2bdb22 Apr 16 16:26:39.399344 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:39.399252 2577 generic.go:358] "Generic (PLEG): container finished" podID="462c2651-1d30-4908-9371-dc7b66a64e53" containerID="7685fe50601215dfa0dd9f771b5a1a895e56c67f827edb149247aa4faacde362" exitCode=0 Apr 16 16:26:39.399698 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:39.399345 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"462c2651-1d30-4908-9371-dc7b66a64e53","Type":"ContainerDied","Data":"7685fe50601215dfa0dd9f771b5a1a895e56c67f827edb149247aa4faacde362"} Apr 16 16:26:39.399698 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:39.399380 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"462c2651-1d30-4908-9371-dc7b66a64e53","Type":"ContainerStarted","Data":"b1e234062fb96ee2d07efae801b2660c048c4be17897f584854685e03b2bdb22"} Apr 16 16:26:39.657745 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:39.656967 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="106aaf56-f922-4d25-baa9-402d7df5662e" 
path="/var/lib/kubelet/pods/106aaf56-f922-4d25-baa9-402d7df5662e/volumes" Apr 16 16:26:40.405636 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:40.405597 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"462c2651-1d30-4908-9371-dc7b66a64e53","Type":"ContainerStarted","Data":"181d35eb8fb64696360675a229b397725ff9a9e7b716f94d7e0a443cf2e0f000"} Apr 16 16:26:40.405636 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:40.405641 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"462c2651-1d30-4908-9371-dc7b66a64e53","Type":"ContainerStarted","Data":"0bafa3d20b7f10a6c1c9073eade6fbd4280d0950466cc5b36c7386111f9a0d91"} Apr 16 16:26:40.406120 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:40.405653 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"462c2651-1d30-4908-9371-dc7b66a64e53","Type":"ContainerStarted","Data":"a41fc5b8e0151538064f08cb521fcdd44bbbedc5ddeac656b5d36d822c6c8ffd"} Apr 16 16:26:40.406120 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:40.405662 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"462c2651-1d30-4908-9371-dc7b66a64e53","Type":"ContainerStarted","Data":"43b849eca8853e453012b7f11f92308848adf97b27c71a227bb07ea3acb23e93"} Apr 16 16:26:40.406120 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:40.405670 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"462c2651-1d30-4908-9371-dc7b66a64e53","Type":"ContainerStarted","Data":"3dc8c507e7e4af7bb041fa7001aedbaca10b09d9dfa66e974826bd7657a67fea"} Apr 16 16:26:40.406120 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:40.405679 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"462c2651-1d30-4908-9371-dc7b66a64e53","Type":"ContainerStarted","Data":"d64730ad9eac4962c9a43d181650c40d1dbe333949d1351851cac30e6b5d8a80"} Apr 16 16:26:40.435859 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:40.435802 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.43576177 podStartE2EDuration="2.43576177s" podCreationTimestamp="2026-04-16 16:26:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:26:40.434038866 +0000 UTC m=+229.398819389" watchObservedRunningTime="2026-04-16 16:26:40.43576177 +0000 UTC m=+229.400542267" Apr 16 16:26:43.778202 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:26:43.778163 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:27:38.778931 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:27:38.778887 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:27:38.794533 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:27:38.794500 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:27:39.592772 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:27:39.592742 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:27:51.517973 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:27:51.517942 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hschh_652350aa-d2fc-4c32-bc1b-e593db927908/ovn-acl-logging/0.log" Apr 16 16:27:51.519060 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:27:51.519035 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hschh_652350aa-d2fc-4c32-bc1b-e593db927908/ovn-acl-logging/0.log" Apr 16 16:27:51.524602 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:27:51.524583 2577 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 16:31:20.117467 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:20.117420 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-8bkn7"] Apr 16 16:31:20.120890 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:20.120864 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-8bkn7" Apr 16 16:31:20.123763 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:20.123732 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 16:31:20.124710 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:20.124688 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 16:31:20.124710 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:20.124686 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-4ql66\"" Apr 16 16:31:20.130911 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:20.130878 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-8bkn7"] Apr 16 16:31:20.141624 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:20.141582 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1519e8da-b35a-4b68-b642-2960664dd605-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-8bkn7\" (UID: \"1519e8da-b35a-4b68-b642-2960664dd605\") " pod="cert-manager/cert-manager-webhook-597b96b99b-8bkn7" Apr 16 16:31:20.141792 ip-10-0-130-165 
kubenswrapper[2577]: I0416 16:31:20.141637 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk4tv\" (UniqueName: \"kubernetes.io/projected/1519e8da-b35a-4b68-b642-2960664dd605-kube-api-access-xk4tv\") pod \"cert-manager-webhook-597b96b99b-8bkn7\" (UID: \"1519e8da-b35a-4b68-b642-2960664dd605\") " pod="cert-manager/cert-manager-webhook-597b96b99b-8bkn7" Apr 16 16:31:20.242859 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:20.242818 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1519e8da-b35a-4b68-b642-2960664dd605-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-8bkn7\" (UID: \"1519e8da-b35a-4b68-b642-2960664dd605\") " pod="cert-manager/cert-manager-webhook-597b96b99b-8bkn7" Apr 16 16:31:20.243040 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:20.242879 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xk4tv\" (UniqueName: \"kubernetes.io/projected/1519e8da-b35a-4b68-b642-2960664dd605-kube-api-access-xk4tv\") pod \"cert-manager-webhook-597b96b99b-8bkn7\" (UID: \"1519e8da-b35a-4b68-b642-2960664dd605\") " pod="cert-manager/cert-manager-webhook-597b96b99b-8bkn7" Apr 16 16:31:20.251304 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:20.251264 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1519e8da-b35a-4b68-b642-2960664dd605-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-8bkn7\" (UID: \"1519e8da-b35a-4b68-b642-2960664dd605\") " pod="cert-manager/cert-manager-webhook-597b96b99b-8bkn7" Apr 16 16:31:20.251497 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:20.251413 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk4tv\" (UniqueName: \"kubernetes.io/projected/1519e8da-b35a-4b68-b642-2960664dd605-kube-api-access-xk4tv\") 
pod \"cert-manager-webhook-597b96b99b-8bkn7\" (UID: \"1519e8da-b35a-4b68-b642-2960664dd605\") " pod="cert-manager/cert-manager-webhook-597b96b99b-8bkn7" Apr 16 16:31:20.442313 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:20.442279 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-8bkn7" Apr 16 16:31:20.562751 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:20.562719 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-8bkn7"] Apr 16 16:31:20.565743 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:31:20.565712 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1519e8da_b35a_4b68_b642_2960664dd605.slice/crio-2773febd8fd238c6b6f205166d89ede0ff8050cd07f7ed6820f0e82a91ff44f7 WatchSource:0}: Error finding container 2773febd8fd238c6b6f205166d89ede0ff8050cd07f7ed6820f0e82a91ff44f7: Status 404 returned error can't find the container with id 2773febd8fd238c6b6f205166d89ede0ff8050cd07f7ed6820f0e82a91ff44f7 Apr 16 16:31:20.567499 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:20.567482 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:31:21.210531 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:21.210501 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-8bkn7" event={"ID":"1519e8da-b35a-4b68-b642-2960664dd605","Type":"ContainerStarted","Data":"2773febd8fd238c6b6f205166d89ede0ff8050cd07f7ed6820f0e82a91ff44f7"} Apr 16 16:31:24.221281 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:24.221243 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-8bkn7" event={"ID":"1519e8da-b35a-4b68-b642-2960664dd605","Type":"ContainerStarted","Data":"92c17c3889c3b69c71339b563aceb5bb5fb4439e49acda2d404782a955814e37"} 
Apr 16 16:31:24.221712 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:24.221312 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-8bkn7" Apr 16 16:31:30.229193 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:30.229163 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-8bkn7" Apr 16 16:31:30.246622 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:30.246567 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-8bkn7" podStartSLOduration=6.920948165 podStartE2EDuration="10.246554426s" podCreationTimestamp="2026-04-16 16:31:20 +0000 UTC" firstStartedPulling="2026-04-16 16:31:20.567642448 +0000 UTC m=+509.532422926" lastFinishedPulling="2026-04-16 16:31:23.893248708 +0000 UTC m=+512.858029187" observedRunningTime="2026-04-16 16:31:24.244248575 +0000 UTC m=+513.209029071" watchObservedRunningTime="2026-04-16 16:31:30.246554426 +0000 UTC m=+519.211334925" Apr 16 16:31:47.330368 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:47.330291 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-846585b969-kxngw"] Apr 16 16:31:47.335186 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:47.335164 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-846585b969-kxngw" Apr 16 16:31:47.338596 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:47.338575 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 16 16:31:47.338791 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:47.338773 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 16 16:31:47.339905 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:47.339888 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 16:31:47.339990 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:47.339914 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 16 16:31:47.340349 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:47.340333 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:31:47.343184 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:47.343165 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-lcdzz\"" Apr 16 16:31:47.355922 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:47.355890 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-846585b969-kxngw"] Apr 16 16:31:47.455577 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:47.455537 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/38a2106e-c979-4ffa-8381-3b151f24acd7-manager-config\") pod \"lws-controller-manager-846585b969-kxngw\" (UID: \"38a2106e-c979-4ffa-8381-3b151f24acd7\") " 
pod="openshift-lws-operator/lws-controller-manager-846585b969-kxngw" Apr 16 16:31:47.455748 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:47.455596 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/38a2106e-c979-4ffa-8381-3b151f24acd7-metrics-cert\") pod \"lws-controller-manager-846585b969-kxngw\" (UID: \"38a2106e-c979-4ffa-8381-3b151f24acd7\") " pod="openshift-lws-operator/lws-controller-manager-846585b969-kxngw" Apr 16 16:31:47.455748 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:47.455696 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/38a2106e-c979-4ffa-8381-3b151f24acd7-cert\") pod \"lws-controller-manager-846585b969-kxngw\" (UID: \"38a2106e-c979-4ffa-8381-3b151f24acd7\") " pod="openshift-lws-operator/lws-controller-manager-846585b969-kxngw" Apr 16 16:31:47.455748 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:47.455736 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqsjj\" (UniqueName: \"kubernetes.io/projected/38a2106e-c979-4ffa-8381-3b151f24acd7-kube-api-access-hqsjj\") pod \"lws-controller-manager-846585b969-kxngw\" (UID: \"38a2106e-c979-4ffa-8381-3b151f24acd7\") " pod="openshift-lws-operator/lws-controller-manager-846585b969-kxngw" Apr 16 16:31:47.556868 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:47.556823 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/38a2106e-c979-4ffa-8381-3b151f24acd7-cert\") pod \"lws-controller-manager-846585b969-kxngw\" (UID: \"38a2106e-c979-4ffa-8381-3b151f24acd7\") " pod="openshift-lws-operator/lws-controller-manager-846585b969-kxngw" Apr 16 16:31:47.557068 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:47.556891 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hqsjj\" (UniqueName: \"kubernetes.io/projected/38a2106e-c979-4ffa-8381-3b151f24acd7-kube-api-access-hqsjj\") pod \"lws-controller-manager-846585b969-kxngw\" (UID: \"38a2106e-c979-4ffa-8381-3b151f24acd7\") " pod="openshift-lws-operator/lws-controller-manager-846585b969-kxngw" Apr 16 16:31:47.557068 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:47.556946 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/38a2106e-c979-4ffa-8381-3b151f24acd7-manager-config\") pod \"lws-controller-manager-846585b969-kxngw\" (UID: \"38a2106e-c979-4ffa-8381-3b151f24acd7\") " pod="openshift-lws-operator/lws-controller-manager-846585b969-kxngw" Apr 16 16:31:47.557068 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:47.556972 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/38a2106e-c979-4ffa-8381-3b151f24acd7-metrics-cert\") pod \"lws-controller-manager-846585b969-kxngw\" (UID: \"38a2106e-c979-4ffa-8381-3b151f24acd7\") " pod="openshift-lws-operator/lws-controller-manager-846585b969-kxngw" Apr 16 16:31:47.557583 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:47.557556 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/38a2106e-c979-4ffa-8381-3b151f24acd7-manager-config\") pod \"lws-controller-manager-846585b969-kxngw\" (UID: \"38a2106e-c979-4ffa-8381-3b151f24acd7\") " pod="openshift-lws-operator/lws-controller-manager-846585b969-kxngw" Apr 16 16:31:47.559337 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:47.559316 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/38a2106e-c979-4ffa-8381-3b151f24acd7-cert\") pod \"lws-controller-manager-846585b969-kxngw\" (UID: \"38a2106e-c979-4ffa-8381-3b151f24acd7\") 
" pod="openshift-lws-operator/lws-controller-manager-846585b969-kxngw" Apr 16 16:31:47.559546 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:47.559524 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/38a2106e-c979-4ffa-8381-3b151f24acd7-metrics-cert\") pod \"lws-controller-manager-846585b969-kxngw\" (UID: \"38a2106e-c979-4ffa-8381-3b151f24acd7\") " pod="openshift-lws-operator/lws-controller-manager-846585b969-kxngw" Apr 16 16:31:47.601678 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:47.601610 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqsjj\" (UniqueName: \"kubernetes.io/projected/38a2106e-c979-4ffa-8381-3b151f24acd7-kube-api-access-hqsjj\") pod \"lws-controller-manager-846585b969-kxngw\" (UID: \"38a2106e-c979-4ffa-8381-3b151f24acd7\") " pod="openshift-lws-operator/lws-controller-manager-846585b969-kxngw" Apr 16 16:31:47.644615 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:47.644576 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-846585b969-kxngw" Apr 16 16:31:47.779350 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:47.779320 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-846585b969-kxngw"] Apr 16 16:31:47.782523 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:31:47.782495 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38a2106e_c979_4ffa_8381_3b151f24acd7.slice/crio-8c36c3c66ae9d28a33aada5089103aef97db32057ec4ee5acad6973f8a25ec7a WatchSource:0}: Error finding container 8c36c3c66ae9d28a33aada5089103aef97db32057ec4ee5acad6973f8a25ec7a: Status 404 returned error can't find the container with id 8c36c3c66ae9d28a33aada5089103aef97db32057ec4ee5acad6973f8a25ec7a Apr 16 16:31:48.291262 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:48.291226 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-846585b969-kxngw" event={"ID":"38a2106e-c979-4ffa-8381-3b151f24acd7","Type":"ContainerStarted","Data":"8c36c3c66ae9d28a33aada5089103aef97db32057ec4ee5acad6973f8a25ec7a"} Apr 16 16:31:50.298088 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:50.298050 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-846585b969-kxngw" event={"ID":"38a2106e-c979-4ffa-8381-3b151f24acd7","Type":"ContainerStarted","Data":"5eeffc42f13f21a4a35f91e47caac7216dac249a1e13e28a80373dff54461ebc"} Apr 16 16:31:50.298483 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:50.298114 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-846585b969-kxngw" Apr 16 16:31:50.317239 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:31:50.317195 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-846585b969-kxngw" podStartSLOduration=1.011289007 podStartE2EDuration="3.317182261s" podCreationTimestamp="2026-04-16 16:31:47 +0000 UTC" firstStartedPulling="2026-04-16 16:31:47.784345757 +0000 UTC m=+536.749126233" lastFinishedPulling="2026-04-16 16:31:50.09023901 +0000 UTC m=+539.055019487" observedRunningTime="2026-04-16 16:31:50.31701397 +0000 UTC m=+539.281794465" watchObservedRunningTime="2026-04-16 16:31:50.317182261 +0000 UTC m=+539.281962759" Apr 16 16:32:01.303329 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:01.303297 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-846585b969-kxngw" Apr 16 16:32:14.853660 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:14.853605 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8"] Apr 16 16:32:14.856890 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:14.856874 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:14.859556 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:14.859533 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 16 16:32:14.859670 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:14.859594 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 16:32:14.859826 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:14.859810 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 16:32:14.860005 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:14.859985 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-fbvbb\"" Apr 16 16:32:14.871095 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:14.871074 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8"] Apr 16 16:32:14.968288 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:14.968256 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/981a5a65-c6e6-43dd-828e-1d3b5a580b24-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fd2h8\" (UID: \"981a5a65-c6e6-43dd-828e-1d3b5a580b24\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:14.968387 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:14.968292 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/981a5a65-c6e6-43dd-828e-1d3b5a580b24-istio-podinfo\") pod 
\"openshift-ai-inference-openshift-default-7c5447bb76-fd2h8\" (UID: \"981a5a65-c6e6-43dd-828e-1d3b5a580b24\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:14.968387 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:14.968318 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/981a5a65-c6e6-43dd-828e-1d3b5a580b24-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fd2h8\" (UID: \"981a5a65-c6e6-43dd-828e-1d3b5a580b24\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:14.968387 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:14.968361 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/981a5a65-c6e6-43dd-828e-1d3b5a580b24-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fd2h8\" (UID: \"981a5a65-c6e6-43dd-828e-1d3b5a580b24\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:14.968387 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:14.968379 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/981a5a65-c6e6-43dd-828e-1d3b5a580b24-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fd2h8\" (UID: \"981a5a65-c6e6-43dd-828e-1d3b5a580b24\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:14.968547 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:14.968394 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/981a5a65-c6e6-43dd-828e-1d3b5a580b24-credential-socket\") pod 
\"openshift-ai-inference-openshift-default-7c5447bb76-fd2h8\" (UID: \"981a5a65-c6e6-43dd-828e-1d3b5a580b24\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:14.968547 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:14.968464 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/981a5a65-c6e6-43dd-828e-1d3b5a580b24-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fd2h8\" (UID: \"981a5a65-c6e6-43dd-828e-1d3b5a580b24\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:14.968547 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:14.968495 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/981a5a65-c6e6-43dd-828e-1d3b5a580b24-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fd2h8\" (UID: \"981a5a65-c6e6-43dd-828e-1d3b5a580b24\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:14.968547 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:14.968518 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdbsx\" (UniqueName: \"kubernetes.io/projected/981a5a65-c6e6-43dd-828e-1d3b5a580b24-kube-api-access-zdbsx\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fd2h8\" (UID: \"981a5a65-c6e6-43dd-828e-1d3b5a580b24\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:15.069082 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:15.069059 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/981a5a65-c6e6-43dd-828e-1d3b5a580b24-istio-envoy\") pod 
\"openshift-ai-inference-openshift-default-7c5447bb76-fd2h8\" (UID: \"981a5a65-c6e6-43dd-828e-1d3b5a580b24\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:15.069223 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:15.069095 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/981a5a65-c6e6-43dd-828e-1d3b5a580b24-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fd2h8\" (UID: \"981a5a65-c6e6-43dd-828e-1d3b5a580b24\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:15.069223 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:15.069114 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/981a5a65-c6e6-43dd-828e-1d3b5a580b24-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fd2h8\" (UID: \"981a5a65-c6e6-43dd-828e-1d3b5a580b24\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:15.069223 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:15.069129 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/981a5a65-c6e6-43dd-828e-1d3b5a580b24-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fd2h8\" (UID: \"981a5a65-c6e6-43dd-828e-1d3b5a580b24\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:15.069223 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:15.069158 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/981a5a65-c6e6-43dd-828e-1d3b5a580b24-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fd2h8\" (UID: 
\"981a5a65-c6e6-43dd-828e-1d3b5a580b24\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:15.069223 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:15.069204 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/981a5a65-c6e6-43dd-828e-1d3b5a580b24-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fd2h8\" (UID: \"981a5a65-c6e6-43dd-828e-1d3b5a580b24\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:15.069491 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:15.069230 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdbsx\" (UniqueName: \"kubernetes.io/projected/981a5a65-c6e6-43dd-828e-1d3b5a580b24-kube-api-access-zdbsx\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fd2h8\" (UID: \"981a5a65-c6e6-43dd-828e-1d3b5a580b24\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:15.069491 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:15.069290 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/981a5a65-c6e6-43dd-828e-1d3b5a580b24-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fd2h8\" (UID: \"981a5a65-c6e6-43dd-828e-1d3b5a580b24\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:15.069491 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:15.069315 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/981a5a65-c6e6-43dd-828e-1d3b5a580b24-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fd2h8\" (UID: \"981a5a65-c6e6-43dd-828e-1d3b5a580b24\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:15.069674 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:15.069650 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/981a5a65-c6e6-43dd-828e-1d3b5a580b24-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fd2h8\" (UID: \"981a5a65-c6e6-43dd-828e-1d3b5a580b24\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:15.069787 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:15.069687 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/981a5a65-c6e6-43dd-828e-1d3b5a580b24-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fd2h8\" (UID: \"981a5a65-c6e6-43dd-828e-1d3b5a580b24\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:15.069900 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:15.069839 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/981a5a65-c6e6-43dd-828e-1d3b5a580b24-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fd2h8\" (UID: \"981a5a65-c6e6-43dd-828e-1d3b5a580b24\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:15.070002 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:15.069905 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/981a5a65-c6e6-43dd-828e-1d3b5a580b24-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fd2h8\" (UID: \"981a5a65-c6e6-43dd-828e-1d3b5a580b24\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:15.070002 ip-10-0-130-165 
kubenswrapper[2577]: I0416 16:32:15.069921 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/981a5a65-c6e6-43dd-828e-1d3b5a580b24-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fd2h8\" (UID: \"981a5a65-c6e6-43dd-828e-1d3b5a580b24\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:15.071946 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:15.071926 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/981a5a65-c6e6-43dd-828e-1d3b5a580b24-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fd2h8\" (UID: \"981a5a65-c6e6-43dd-828e-1d3b5a580b24\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:15.072051 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:15.072035 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/981a5a65-c6e6-43dd-828e-1d3b5a580b24-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fd2h8\" (UID: \"981a5a65-c6e6-43dd-828e-1d3b5a580b24\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:15.077335 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:15.077312 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/981a5a65-c6e6-43dd-828e-1d3b5a580b24-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fd2h8\" (UID: \"981a5a65-c6e6-43dd-828e-1d3b5a580b24\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:15.077553 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:15.077537 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdbsx\" 
(UniqueName: \"kubernetes.io/projected/981a5a65-c6e6-43dd-828e-1d3b5a580b24-kube-api-access-zdbsx\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fd2h8\" (UID: \"981a5a65-c6e6-43dd-828e-1d3b5a580b24\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:15.168102 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:15.168083 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:15.289121 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:15.289093 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8"] Apr 16 16:32:15.291369 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:32:15.291339 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod981a5a65_c6e6_43dd_828e_1d3b5a580b24.slice/crio-c1520078c4d50314fcc747ed766064620fe7cac09e88c33b4e871a623b8a4a57 WatchSource:0}: Error finding container c1520078c4d50314fcc747ed766064620fe7cac09e88c33b4e871a623b8a4a57: Status 404 returned error can't find the container with id c1520078c4d50314fcc747ed766064620fe7cac09e88c33b4e871a623b8a4a57 Apr 16 16:32:15.377688 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:15.377659 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" event={"ID":"981a5a65-c6e6-43dd-828e-1d3b5a580b24","Type":"ContainerStarted","Data":"c1520078c4d50314fcc747ed766064620fe7cac09e88c33b4e871a623b8a4a57"} Apr 16 16:32:17.736780 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:17.736747 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 16:32:17.737055 ip-10-0-130-165 
kubenswrapper[2577]: I0416 16:32:17.736818 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 16:32:17.737055 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:17.736845 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 16:32:18.390735 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:18.390697 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" event={"ID":"981a5a65-c6e6-43dd-828e-1d3b5a580b24","Type":"ContainerStarted","Data":"65e3ade315b19e7456c7e15e9aa4a9ee7237f011a2323d62d168ba2eb7f00680"} Apr 16 16:32:18.415259 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:18.415209 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" podStartSLOduration=1.971892966 podStartE2EDuration="4.415194048s" podCreationTimestamp="2026-04-16 16:32:14 +0000 UTC" firstStartedPulling="2026-04-16 16:32:15.293226364 +0000 UTC m=+564.258006837" lastFinishedPulling="2026-04-16 16:32:17.736527445 +0000 UTC m=+566.701307919" observedRunningTime="2026-04-16 16:32:18.414753768 +0000 UTC m=+567.379534265" watchObservedRunningTime="2026-04-16 16:32:18.415194048 +0000 UTC m=+567.379974543" Apr 16 16:32:19.168402 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:19.168376 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:19.172988 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:19.172966 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:19.393912 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:19.393889 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:19.399607 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:19.399583 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fd2h8" Apr 16 16:32:34.641494 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:34.641459 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-g7ll6"] Apr 16 16:32:34.647790 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:34.647769 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-g7ll6" Apr 16 16:32:34.652855 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:34.652787 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 16:32:34.652855 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:34.652838 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-ltt7x\"" Apr 16 16:32:34.653100 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:34.653083 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 16:32:34.654211 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:34.653971 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 16 16:32:34.657989 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:34.657971 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-g7ll6"] Apr 16 16:32:34.739229 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:34.739198 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq95l\" (UniqueName: \"kubernetes.io/projected/3efab7bb-d0d9-4b54-aec8-ae9d9d4408a5-kube-api-access-nq95l\") pod \"dns-operator-controller-manager-844548ff4c-g7ll6\" (UID: \"3efab7bb-d0d9-4b54-aec8-ae9d9d4408a5\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-g7ll6" Apr 16 16:32:34.840717 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:34.840671 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nq95l\" (UniqueName: \"kubernetes.io/projected/3efab7bb-d0d9-4b54-aec8-ae9d9d4408a5-kube-api-access-nq95l\") pod \"dns-operator-controller-manager-844548ff4c-g7ll6\" (UID: \"3efab7bb-d0d9-4b54-aec8-ae9d9d4408a5\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-g7ll6" Apr 16 16:32:34.858359 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:34.858326 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq95l\" (UniqueName: \"kubernetes.io/projected/3efab7bb-d0d9-4b54-aec8-ae9d9d4408a5-kube-api-access-nq95l\") pod \"dns-operator-controller-manager-844548ff4c-g7ll6\" (UID: \"3efab7bb-d0d9-4b54-aec8-ae9d9d4408a5\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-g7ll6" Apr 16 16:32:34.957825 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:34.957794 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-g7ll6" Apr 16 16:32:35.087046 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:35.087022 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-g7ll6"] Apr 16 16:32:35.091361 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:32:35.091327 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3efab7bb_d0d9_4b54_aec8_ae9d9d4408a5.slice/crio-cccc7a1af832762bc0aaafd4b0e17d694413e6818c2d7ab702af3e2c5d13585d WatchSource:0}: Error finding container cccc7a1af832762bc0aaafd4b0e17d694413e6818c2d7ab702af3e2c5d13585d: Status 404 returned error can't find the container with id cccc7a1af832762bc0aaafd4b0e17d694413e6818c2d7ab702af3e2c5d13585d Apr 16 16:32:35.451128 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:35.451097 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-g7ll6" event={"ID":"3efab7bb-d0d9-4b54-aec8-ae9d9d4408a5","Type":"ContainerStarted","Data":"cccc7a1af832762bc0aaafd4b0e17d694413e6818c2d7ab702af3e2c5d13585d"} Apr 16 16:32:38.463384 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:38.463349 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-g7ll6" event={"ID":"3efab7bb-d0d9-4b54-aec8-ae9d9d4408a5","Type":"ContainerStarted","Data":"d5dd45f9627b97c5718a564f5c6532e3f6f5b2185071de276c2bdea813877270"} Apr 16 16:32:38.463855 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:38.463404 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-g7ll6" Apr 16 16:32:43.045204 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:43.045143 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-g7ll6" podStartSLOduration=6.528844616 podStartE2EDuration="9.04512849s" podCreationTimestamp="2026-04-16 16:32:34 +0000 UTC" firstStartedPulling="2026-04-16 16:32:35.094004988 +0000 UTC m=+584.058785462" lastFinishedPulling="2026-04-16 16:32:37.610288859 +0000 UTC m=+586.575069336" observedRunningTime="2026-04-16 16:32:38.486465915 +0000 UTC m=+587.451246411" watchObservedRunningTime="2026-04-16 16:32:43.04512849 +0000 UTC m=+592.009908985" Apr 16 16:32:43.046538 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:43.046513 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-wj5hp"] Apr 16 16:32:43.049732 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:43.049713 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-wj5hp" Apr 16 16:32:43.052784 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:43.052767 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-lf2g2\"" Apr 16 16:32:43.069681 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:43.069658 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-wj5hp"] Apr 16 16:32:43.111822 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:43.111791 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm2xs\" (UniqueName: \"kubernetes.io/projected/afa14995-cc29-4885-8eda-eea6e807b984-kube-api-access-xm2xs\") pod \"authorino-operator-7587b89b76-wj5hp\" (UID: \"afa14995-cc29-4885-8eda-eea6e807b984\") " pod="kuadrant-system/authorino-operator-7587b89b76-wj5hp" Apr 16 16:32:43.212532 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:43.212507 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xm2xs\" (UniqueName: 
\"kubernetes.io/projected/afa14995-cc29-4885-8eda-eea6e807b984-kube-api-access-xm2xs\") pod \"authorino-operator-7587b89b76-wj5hp\" (UID: \"afa14995-cc29-4885-8eda-eea6e807b984\") " pod="kuadrant-system/authorino-operator-7587b89b76-wj5hp" Apr 16 16:32:43.221409 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:43.221390 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm2xs\" (UniqueName: \"kubernetes.io/projected/afa14995-cc29-4885-8eda-eea6e807b984-kube-api-access-xm2xs\") pod \"authorino-operator-7587b89b76-wj5hp\" (UID: \"afa14995-cc29-4885-8eda-eea6e807b984\") " pod="kuadrant-system/authorino-operator-7587b89b76-wj5hp" Apr 16 16:32:43.359781 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:43.359720 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-wj5hp" Apr 16 16:32:43.477734 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:43.477713 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-wj5hp"] Apr 16 16:32:43.480268 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:32:43.480244 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafa14995_cc29_4885_8eda_eea6e807b984.slice/crio-6052f8845260fd85cbe3c71d6611eca02a0290ca5281559fd20306620eafaccb WatchSource:0}: Error finding container 6052f8845260fd85cbe3c71d6611eca02a0290ca5281559fd20306620eafaccb: Status 404 returned error can't find the container with id 6052f8845260fd85cbe3c71d6611eca02a0290ca5281559fd20306620eafaccb Apr 16 16:32:44.483684 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:44.483643 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-wj5hp" event={"ID":"afa14995-cc29-4885-8eda-eea6e807b984","Type":"ContainerStarted","Data":"6052f8845260fd85cbe3c71d6611eca02a0290ca5281559fd20306620eafaccb"} Apr 16 
16:32:45.487994 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:45.487960 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-wj5hp" event={"ID":"afa14995-cc29-4885-8eda-eea6e807b984","Type":"ContainerStarted","Data":"b0395d58c2542dc8f796f2542aa09516d031a5affb8d4b7c8bc94424efaf19ad"} Apr 16 16:32:45.488427 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:45.488061 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-wj5hp" Apr 16 16:32:45.509136 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:45.509092 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-wj5hp" podStartSLOduration=0.881935815 podStartE2EDuration="2.509078585s" podCreationTimestamp="2026-04-16 16:32:43 +0000 UTC" firstStartedPulling="2026-04-16 16:32:43.482285186 +0000 UTC m=+592.447065661" lastFinishedPulling="2026-04-16 16:32:45.109427957 +0000 UTC m=+594.074208431" observedRunningTime="2026-04-16 16:32:45.507180982 +0000 UTC m=+594.471961478" watchObservedRunningTime="2026-04-16 16:32:45.509078585 +0000 UTC m=+594.473859080" Apr 16 16:32:49.468406 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:49.468373 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-g7ll6" Apr 16 16:32:51.542698 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:51.542668 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hschh_652350aa-d2fc-4c32-bc1b-e593db927908/ovn-acl-logging/0.log" Apr 16 16:32:51.542698 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:32:51.542700 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hschh_652350aa-d2fc-4c32-bc1b-e593db927908/ovn-acl-logging/0.log" Apr 16 16:32:56.493079 ip-10-0-130-165 kubenswrapper[2577]: 
I0416 16:32:56.493052 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-wj5hp" Apr 16 16:33:07.256484 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:33:07.256409 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-j55fq"] Apr 16 16:33:07.265286 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:33:07.265234 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-j55fq" Apr 16 16:33:07.267194 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:33:07.267166 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-j55fq"] Apr 16 16:33:07.267793 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:33:07.267773 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-szw9h\"" Apr 16 16:33:07.268778 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:33:07.268757 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 16 16:33:07.284136 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:33:07.284101 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-j55fq"] Apr 16 16:33:07.399034 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:33:07.398994 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp9ns\" (UniqueName: \"kubernetes.io/projected/7ba5ebd8-517f-4adb-9ee2-e934b8ef4864-kube-api-access-pp9ns\") pod \"limitador-limitador-67566c68b4-j55fq\" (UID: \"7ba5ebd8-517f-4adb-9ee2-e934b8ef4864\") " pod="kuadrant-system/limitador-limitador-67566c68b4-j55fq" Apr 16 16:33:07.399206 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:33:07.399059 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/7ba5ebd8-517f-4adb-9ee2-e934b8ef4864-config-file\") pod \"limitador-limitador-67566c68b4-j55fq\" (UID: \"7ba5ebd8-517f-4adb-9ee2-e934b8ef4864\") " pod="kuadrant-system/limitador-limitador-67566c68b4-j55fq" Apr 16 16:33:07.500102 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:33:07.500062 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/7ba5ebd8-517f-4adb-9ee2-e934b8ef4864-config-file\") pod \"limitador-limitador-67566c68b4-j55fq\" (UID: \"7ba5ebd8-517f-4adb-9ee2-e934b8ef4864\") " pod="kuadrant-system/limitador-limitador-67566c68b4-j55fq" Apr 16 16:33:07.500273 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:33:07.500159 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pp9ns\" (UniqueName: \"kubernetes.io/projected/7ba5ebd8-517f-4adb-9ee2-e934b8ef4864-kube-api-access-pp9ns\") pod \"limitador-limitador-67566c68b4-j55fq\" (UID: \"7ba5ebd8-517f-4adb-9ee2-e934b8ef4864\") " pod="kuadrant-system/limitador-limitador-67566c68b4-j55fq" Apr 16 16:33:07.500723 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:33:07.500703 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/7ba5ebd8-517f-4adb-9ee2-e934b8ef4864-config-file\") pod \"limitador-limitador-67566c68b4-j55fq\" (UID: \"7ba5ebd8-517f-4adb-9ee2-e934b8ef4864\") " pod="kuadrant-system/limitador-limitador-67566c68b4-j55fq" Apr 16 16:33:07.508766 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:33:07.508719 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp9ns\" (UniqueName: \"kubernetes.io/projected/7ba5ebd8-517f-4adb-9ee2-e934b8ef4864-kube-api-access-pp9ns\") pod \"limitador-limitador-67566c68b4-j55fq\" (UID: \"7ba5ebd8-517f-4adb-9ee2-e934b8ef4864\") " 
pod="kuadrant-system/limitador-limitador-67566c68b4-j55fq" Apr 16 16:33:07.576522 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:33:07.576479 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-j55fq" Apr 16 16:33:07.695602 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:33:07.695580 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-j55fq"] Apr 16 16:33:07.698084 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:33:07.698053 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ba5ebd8_517f_4adb_9ee2_e934b8ef4864.slice/crio-c485afc3a49c0195830711d81a3983747fc492f4d0caaed02c72feec86a46cb6 WatchSource:0}: Error finding container c485afc3a49c0195830711d81a3983747fc492f4d0caaed02c72feec86a46cb6: Status 404 returned error can't find the container with id c485afc3a49c0195830711d81a3983747fc492f4d0caaed02c72feec86a46cb6 Apr 16 16:33:08.567117 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:33:08.567083 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-j55fq" event={"ID":"7ba5ebd8-517f-4adb-9ee2-e934b8ef4864","Type":"ContainerStarted","Data":"c485afc3a49c0195830711d81a3983747fc492f4d0caaed02c72feec86a46cb6"} Apr 16 16:33:12.582145 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:33:12.582110 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-j55fq" event={"ID":"7ba5ebd8-517f-4adb-9ee2-e934b8ef4864","Type":"ContainerStarted","Data":"eb1a57dc4b36ec6779c00a297d4a5ef5249c34d30850c6cb012fc03e5b0bb80e"} Apr 16 16:33:12.582555 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:33:12.582224 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-67566c68b4-j55fq" Apr 16 16:33:12.600419 ip-10-0-130-165 
kubenswrapper[2577]: I0416 16:33:12.600370 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-67566c68b4-j55fq" podStartSLOduration=1.560343423 podStartE2EDuration="5.60035604s" podCreationTimestamp="2026-04-16 16:33:07 +0000 UTC" firstStartedPulling="2026-04-16 16:33:07.699860189 +0000 UTC m=+616.664640662" lastFinishedPulling="2026-04-16 16:33:11.739872802 +0000 UTC m=+620.704653279" observedRunningTime="2026-04-16 16:33:12.598728348 +0000 UTC m=+621.563508855" watchObservedRunningTime="2026-04-16 16:33:12.60035604 +0000 UTC m=+621.565136570" Apr 16 16:33:23.586254 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:33:23.586226 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-67566c68b4-j55fq" Apr 16 16:36:42.509369 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.509329 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42"] Apr 16 16:36:42.512243 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.512222 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:36:42.515105 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.515082 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 16 16:36:42.515105 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.515083 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 16:36:42.515278 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.515196 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 16:36:42.515278 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.515222 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-1-openshift-default-dockercfg-gqhfj\"" Apr 16 16:36:42.525036 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.525008 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42"] Apr 16 16:36:42.603860 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.603824 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/aab3a47b-ea57-4dc4-ba01-6801db77b1e4-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-vfv42\" (UID: \"aab3a47b-ea57-4dc4-ba01-6801db77b1e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:36:42.603860 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.603866 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/aab3a47b-ea57-4dc4-ba01-6801db77b1e4-istio-token\") pod 
\"router-gateway-1-openshift-default-6c59fbf55c-vfv42\" (UID: \"aab3a47b-ea57-4dc4-ba01-6801db77b1e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:36:42.604076 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.603895 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/aab3a47b-ea57-4dc4-ba01-6801db77b1e4-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-vfv42\" (UID: \"aab3a47b-ea57-4dc4-ba01-6801db77b1e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:36:42.604076 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.603919 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/aab3a47b-ea57-4dc4-ba01-6801db77b1e4-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-vfv42\" (UID: \"aab3a47b-ea57-4dc4-ba01-6801db77b1e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:36:42.604076 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.603944 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/aab3a47b-ea57-4dc4-ba01-6801db77b1e4-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-vfv42\" (UID: \"aab3a47b-ea57-4dc4-ba01-6801db77b1e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:36:42.604076 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.603962 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/aab3a47b-ea57-4dc4-ba01-6801db77b1e4-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-vfv42\" (UID: 
\"aab3a47b-ea57-4dc4-ba01-6801db77b1e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:36:42.604076 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.604064 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/aab3a47b-ea57-4dc4-ba01-6801db77b1e4-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-vfv42\" (UID: \"aab3a47b-ea57-4dc4-ba01-6801db77b1e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:36:42.604301 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.604095 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/aab3a47b-ea57-4dc4-ba01-6801db77b1e4-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-vfv42\" (UID: \"aab3a47b-ea57-4dc4-ba01-6801db77b1e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:36:42.604301 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.604120 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnd2l\" (UniqueName: \"kubernetes.io/projected/aab3a47b-ea57-4dc4-ba01-6801db77b1e4-kube-api-access-rnd2l\") pod \"router-gateway-1-openshift-default-6c59fbf55c-vfv42\" (UID: \"aab3a47b-ea57-4dc4-ba01-6801db77b1e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:36:42.705040 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.705001 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/aab3a47b-ea57-4dc4-ba01-6801db77b1e4-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-vfv42\" (UID: \"aab3a47b-ea57-4dc4-ba01-6801db77b1e4\") " 
pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:36:42.705040 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.705039 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/aab3a47b-ea57-4dc4-ba01-6801db77b1e4-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-vfv42\" (UID: \"aab3a47b-ea57-4dc4-ba01-6801db77b1e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:36:42.705272 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.705064 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rnd2l\" (UniqueName: \"kubernetes.io/projected/aab3a47b-ea57-4dc4-ba01-6801db77b1e4-kube-api-access-rnd2l\") pod \"router-gateway-1-openshift-default-6c59fbf55c-vfv42\" (UID: \"aab3a47b-ea57-4dc4-ba01-6801db77b1e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:36:42.705272 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.705126 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/aab3a47b-ea57-4dc4-ba01-6801db77b1e4-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-vfv42\" (UID: \"aab3a47b-ea57-4dc4-ba01-6801db77b1e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:36:42.705272 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.705155 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/aab3a47b-ea57-4dc4-ba01-6801db77b1e4-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-vfv42\" (UID: \"aab3a47b-ea57-4dc4-ba01-6801db77b1e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:36:42.705272 ip-10-0-130-165 kubenswrapper[2577]: 
I0416 16:36:42.705202 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/aab3a47b-ea57-4dc4-ba01-6801db77b1e4-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-vfv42\" (UID: \"aab3a47b-ea57-4dc4-ba01-6801db77b1e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:36:42.705272 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.705224 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/aab3a47b-ea57-4dc4-ba01-6801db77b1e4-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-vfv42\" (UID: \"aab3a47b-ea57-4dc4-ba01-6801db77b1e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:36:42.705272 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.705265 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/aab3a47b-ea57-4dc4-ba01-6801db77b1e4-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-vfv42\" (UID: \"aab3a47b-ea57-4dc4-ba01-6801db77b1e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:36:42.705573 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.705288 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/aab3a47b-ea57-4dc4-ba01-6801db77b1e4-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-vfv42\" (UID: \"aab3a47b-ea57-4dc4-ba01-6801db77b1e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:36:42.705573 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.705493 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/aab3a47b-ea57-4dc4-ba01-6801db77b1e4-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-vfv42\" (UID: \"aab3a47b-ea57-4dc4-ba01-6801db77b1e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:36:42.705573 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.705530 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/aab3a47b-ea57-4dc4-ba01-6801db77b1e4-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-vfv42\" (UID: \"aab3a47b-ea57-4dc4-ba01-6801db77b1e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:36:42.705743 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.705718 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/aab3a47b-ea57-4dc4-ba01-6801db77b1e4-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-vfv42\" (UID: \"aab3a47b-ea57-4dc4-ba01-6801db77b1e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:36:42.705825 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.705808 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/aab3a47b-ea57-4dc4-ba01-6801db77b1e4-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-vfv42\" (UID: \"aab3a47b-ea57-4dc4-ba01-6801db77b1e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:36:42.705956 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.705937 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/aab3a47b-ea57-4dc4-ba01-6801db77b1e4-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-vfv42\" (UID: 
\"aab3a47b-ea57-4dc4-ba01-6801db77b1e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:36:42.707577 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.707554 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/aab3a47b-ea57-4dc4-ba01-6801db77b1e4-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-vfv42\" (UID: \"aab3a47b-ea57-4dc4-ba01-6801db77b1e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:36:42.707773 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.707757 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/aab3a47b-ea57-4dc4-ba01-6801db77b1e4-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-vfv42\" (UID: \"aab3a47b-ea57-4dc4-ba01-6801db77b1e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:36:42.713594 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.713566 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnd2l\" (UniqueName: \"kubernetes.io/projected/aab3a47b-ea57-4dc4-ba01-6801db77b1e4-kube-api-access-rnd2l\") pod \"router-gateway-1-openshift-default-6c59fbf55c-vfv42\" (UID: \"aab3a47b-ea57-4dc4-ba01-6801db77b1e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:36:42.713594 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.713575 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/aab3a47b-ea57-4dc4-ba01-6801db77b1e4-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-vfv42\" (UID: \"aab3a47b-ea57-4dc4-ba01-6801db77b1e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:36:42.823705 ip-10-0-130-165 
kubenswrapper[2577]: I0416 16:36:42.823597 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:36:42.955340 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.955310 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42"] Apr 16 16:36:42.958027 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:36:42.957984 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaab3a47b_ea57_4dc4_ba01_6801db77b1e4.slice/crio-592e841e125f2f7466e19f811335279b3f6f6c33760f34553496a5cfc1b94188 WatchSource:0}: Error finding container 592e841e125f2f7466e19f811335279b3f6f6c33760f34553496a5cfc1b94188: Status 404 returned error can't find the container with id 592e841e125f2f7466e19f811335279b3f6f6c33760f34553496a5cfc1b94188 Apr 16 16:36:42.960078 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.960061 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:36:42.960393 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.960367 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 16:36:42.960491 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.960434 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 16:36:42.960546 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:42.960489 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 16:36:43.264734 ip-10-0-130-165 
kubenswrapper[2577]: I0416 16:36:43.264702 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" event={"ID":"aab3a47b-ea57-4dc4-ba01-6801db77b1e4","Type":"ContainerStarted","Data":"20b69bd24edb67fd267d56851a09062ac575c863832260641d927220f8cb4c6a"} Apr 16 16:36:43.264734 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:43.264737 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" event={"ID":"aab3a47b-ea57-4dc4-ba01-6801db77b1e4","Type":"ContainerStarted","Data":"592e841e125f2f7466e19f811335279b3f6f6c33760f34553496a5cfc1b94188"} Apr 16 16:36:43.288432 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:43.288246 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" podStartSLOduration=1.288224284 podStartE2EDuration="1.288224284s" podCreationTimestamp="2026-04-16 16:36:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:36:43.28554056 +0000 UTC m=+832.250321056" watchObservedRunningTime="2026-04-16 16:36:43.288224284 +0000 UTC m=+832.253004784" Apr 16 16:36:43.823868 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:43.823828 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:36:44.828988 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:44.828960 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:36:45.274167 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:45.274130 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:36:45.275240 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:36:45.275220 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-vfv42" Apr 16 16:37:12.476921 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:12.476889 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx"] Apr 16 16:37:12.480074 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:12.480052 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx" Apr 16 16:37:12.483456 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:12.483425 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs\"" Apr 16 16:37:12.483571 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:12.483461 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4ddlh\"" Apr 16 16:37:12.491026 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:12.491003 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx"] Apr 16 16:37:12.675261 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:12.675216 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3679c3c0-c4b5-497b-9680-eea20721f9c7-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx\" (UID: \"3679c3c0-c4b5-497b-9680-eea20721f9c7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx" 
Apr 16 16:37:12.675261 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:12.675265 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3679c3c0-c4b5-497b-9680-eea20721f9c7-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx\" (UID: \"3679c3c0-c4b5-497b-9680-eea20721f9c7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx" Apr 16 16:37:12.675524 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:12.675352 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3679c3c0-c4b5-497b-9680-eea20721f9c7-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx\" (UID: \"3679c3c0-c4b5-497b-9680-eea20721f9c7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx" Apr 16 16:37:12.675524 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:12.675414 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3679c3c0-c4b5-497b-9680-eea20721f9c7-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx\" (UID: \"3679c3c0-c4b5-497b-9680-eea20721f9c7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx" Apr 16 16:37:12.675524 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:12.675485 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3679c3c0-c4b5-497b-9680-eea20721f9c7-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx\" (UID: \"3679c3c0-c4b5-497b-9680-eea20721f9c7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx" Apr 16 16:37:12.675524 
ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:12.675508 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkxmf\" (UniqueName: \"kubernetes.io/projected/3679c3c0-c4b5-497b-9680-eea20721f9c7-kube-api-access-nkxmf\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx\" (UID: \"3679c3c0-c4b5-497b-9680-eea20721f9c7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx" Apr 16 16:37:12.776963 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:12.776875 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3679c3c0-c4b5-497b-9680-eea20721f9c7-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx\" (UID: \"3679c3c0-c4b5-497b-9680-eea20721f9c7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx" Apr 16 16:37:12.776963 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:12.776939 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3679c3c0-c4b5-497b-9680-eea20721f9c7-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx\" (UID: \"3679c3c0-c4b5-497b-9680-eea20721f9c7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx" Apr 16 16:37:12.777197 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:12.776971 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3679c3c0-c4b5-497b-9680-eea20721f9c7-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx\" (UID: \"3679c3c0-c4b5-497b-9680-eea20721f9c7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx" Apr 16 16:37:12.777197 ip-10-0-130-165 kubenswrapper[2577]: I0416 
16:37:12.776990 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nkxmf\" (UniqueName: \"kubernetes.io/projected/3679c3c0-c4b5-497b-9680-eea20721f9c7-kube-api-access-nkxmf\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx\" (UID: \"3679c3c0-c4b5-497b-9680-eea20721f9c7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx" Apr 16 16:37:12.777197 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:12.777021 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3679c3c0-c4b5-497b-9680-eea20721f9c7-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx\" (UID: \"3679c3c0-c4b5-497b-9680-eea20721f9c7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx" Apr 16 16:37:12.777197 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:12.777041 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3679c3c0-c4b5-497b-9680-eea20721f9c7-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx\" (UID: \"3679c3c0-c4b5-497b-9680-eea20721f9c7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx" Apr 16 16:37:12.777727 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:12.777700 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3679c3c0-c4b5-497b-9680-eea20721f9c7-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx\" (UID: \"3679c3c0-c4b5-497b-9680-eea20721f9c7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx" Apr 16 16:37:12.777831 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:12.777726 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3679c3c0-c4b5-497b-9680-eea20721f9c7-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx\" (UID: \"3679c3c0-c4b5-497b-9680-eea20721f9c7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx" Apr 16 16:37:12.777831 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:12.777748 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3679c3c0-c4b5-497b-9680-eea20721f9c7-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx\" (UID: \"3679c3c0-c4b5-497b-9680-eea20721f9c7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx" Apr 16 16:37:12.779456 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:12.779416 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3679c3c0-c4b5-497b-9680-eea20721f9c7-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx\" (UID: \"3679c3c0-c4b5-497b-9680-eea20721f9c7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx" Apr 16 16:37:12.779662 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:12.779644 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3679c3c0-c4b5-497b-9680-eea20721f9c7-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx\" (UID: \"3679c3c0-c4b5-497b-9680-eea20721f9c7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx" Apr 16 16:37:12.786436 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:12.786413 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkxmf\" (UniqueName: 
\"kubernetes.io/projected/3679c3c0-c4b5-497b-9680-eea20721f9c7-kube-api-access-nkxmf\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx\" (UID: \"3679c3c0-c4b5-497b-9680-eea20721f9c7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx" Apr 16 16:37:12.791189 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:12.791166 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx" Apr 16 16:37:12.918044 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:12.918019 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx"] Apr 16 16:37:12.919971 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:37:12.919936 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3679c3c0_c4b5_497b_9680_eea20721f9c7.slice/crio-48b5e68c3d5285387f0d86e9ef94c67ad838f1655285567e285c24af8f5271af WatchSource:0}: Error finding container 48b5e68c3d5285387f0d86e9ef94c67ad838f1655285567e285c24af8f5271af: Status 404 returned error can't find the container with id 48b5e68c3d5285387f0d86e9ef94c67ad838f1655285567e285c24af8f5271af Apr 16 16:37:13.381537 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:13.381495 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx" event={"ID":"3679c3c0-c4b5-497b-9680-eea20721f9c7","Type":"ContainerStarted","Data":"48b5e68c3d5285387f0d86e9ef94c67ad838f1655285567e285c24af8f5271af"} Apr 16 16:37:17.399050 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:17.399005 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx" 
event={"ID":"3679c3c0-c4b5-497b-9680-eea20721f9c7","Type":"ContainerStarted","Data":"e0681013f7b2950a22d6e9a7a58e9b216fb977525553f0f36b444293b76f4882"} Apr 16 16:37:21.423948 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:21.423912 2577 generic.go:358] "Generic (PLEG): container finished" podID="3679c3c0-c4b5-497b-9680-eea20721f9c7" containerID="e0681013f7b2950a22d6e9a7a58e9b216fb977525553f0f36b444293b76f4882" exitCode=0 Apr 16 16:37:21.424374 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:21.423987 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx" event={"ID":"3679c3c0-c4b5-497b-9680-eea20721f9c7","Type":"ContainerDied","Data":"e0681013f7b2950a22d6e9a7a58e9b216fb977525553f0f36b444293b76f4882"} Apr 16 16:37:24.436200 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:24.436166 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx" event={"ID":"3679c3c0-c4b5-497b-9680-eea20721f9c7","Type":"ContainerStarted","Data":"58d850d87fcc82457976ff2f04625477527a718931dcf40cd6401c5e1da84a3f"} Apr 16 16:37:24.457165 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:24.457114 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx" podStartSLOduration=1.359279218 podStartE2EDuration="12.457100308s" podCreationTimestamp="2026-04-16 16:37:12 +0000 UTC" firstStartedPulling="2026-04-16 16:37:12.921804701 +0000 UTC m=+861.886585179" lastFinishedPulling="2026-04-16 16:37:24.019625791 +0000 UTC m=+872.984406269" observedRunningTime="2026-04-16 16:37:24.454825707 +0000 UTC m=+873.419606203" watchObservedRunningTime="2026-04-16 16:37:24.457100308 +0000 UTC m=+873.421880804" Apr 16 16:37:32.792306 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:32.792267 2577 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx"
Apr 16 16:37:32.792810 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:32.792318 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx"
Apr 16 16:37:32.804926 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:32.804902 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx"
Apr 16 16:37:33.478264 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:33.478235 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx"
Apr 16 16:37:51.574505 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:51.574379 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hschh_652350aa-d2fc-4c32-bc1b-e593db927908/ovn-acl-logging/0.log"
Apr 16 16:37:51.575128 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:37:51.575107 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hschh_652350aa-d2fc-4c32-bc1b-e593db927908/ovn-acl-logging/0.log"
Apr 16 16:38:05.158373 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.158342 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx"]
Apr 16 16:38:05.158873 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.158645 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx" podUID="3679c3c0-c4b5-497b-9680-eea20721f9c7" containerName="main" containerID="cri-o://58d850d87fcc82457976ff2f04625477527a718931dcf40cd6401c5e1da84a3f" gracePeriod=30
Apr 16 16:38:05.401989 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.401966 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx"
Apr 16 16:38:05.447309 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.447239 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkxmf\" (UniqueName: \"kubernetes.io/projected/3679c3c0-c4b5-497b-9680-eea20721f9c7-kube-api-access-nkxmf\") pod \"3679c3c0-c4b5-497b-9680-eea20721f9c7\" (UID: \"3679c3c0-c4b5-497b-9680-eea20721f9c7\") "
Apr 16 16:38:05.447434 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.447323 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3679c3c0-c4b5-497b-9680-eea20721f9c7-kserve-provision-location\") pod \"3679c3c0-c4b5-497b-9680-eea20721f9c7\" (UID: \"3679c3c0-c4b5-497b-9680-eea20721f9c7\") "
Apr 16 16:38:05.447434 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.447342 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3679c3c0-c4b5-497b-9680-eea20721f9c7-model-cache\") pod \"3679c3c0-c4b5-497b-9680-eea20721f9c7\" (UID: \"3679c3c0-c4b5-497b-9680-eea20721f9c7\") "
Apr 16 16:38:05.447434 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.447406 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3679c3c0-c4b5-497b-9680-eea20721f9c7-home\") pod \"3679c3c0-c4b5-497b-9680-eea20721f9c7\" (UID: \"3679c3c0-c4b5-497b-9680-eea20721f9c7\") "
Apr 16 16:38:05.447434 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.447430 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3679c3c0-c4b5-497b-9680-eea20721f9c7-dshm\") pod \"3679c3c0-c4b5-497b-9680-eea20721f9c7\" (UID: \"3679c3c0-c4b5-497b-9680-eea20721f9c7\") "
Apr 16 16:38:05.447659 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.447496 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3679c3c0-c4b5-497b-9680-eea20721f9c7-tls-certs\") pod \"3679c3c0-c4b5-497b-9680-eea20721f9c7\" (UID: \"3679c3c0-c4b5-497b-9680-eea20721f9c7\") "
Apr 16 16:38:05.447746 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.447706 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3679c3c0-c4b5-497b-9680-eea20721f9c7-home" (OuterVolumeSpecName: "home") pod "3679c3c0-c4b5-497b-9680-eea20721f9c7" (UID: "3679c3c0-c4b5-497b-9680-eea20721f9c7"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:38:05.447746 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.447719 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3679c3c0-c4b5-497b-9680-eea20721f9c7-model-cache" (OuterVolumeSpecName: "model-cache") pod "3679c3c0-c4b5-497b-9680-eea20721f9c7" (UID: "3679c3c0-c4b5-497b-9680-eea20721f9c7"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:38:05.449671 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.449633 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3679c3c0-c4b5-497b-9680-eea20721f9c7-kube-api-access-nkxmf" (OuterVolumeSpecName: "kube-api-access-nkxmf") pod "3679c3c0-c4b5-497b-9680-eea20721f9c7" (UID: "3679c3c0-c4b5-497b-9680-eea20721f9c7"). InnerVolumeSpecName "kube-api-access-nkxmf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:38:05.450011 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.449985 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3679c3c0-c4b5-497b-9680-eea20721f9c7-dshm" (OuterVolumeSpecName: "dshm") pod "3679c3c0-c4b5-497b-9680-eea20721f9c7" (UID: "3679c3c0-c4b5-497b-9680-eea20721f9c7"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:38:05.450078 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.450023 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3679c3c0-c4b5-497b-9680-eea20721f9c7-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "3679c3c0-c4b5-497b-9680-eea20721f9c7" (UID: "3679c3c0-c4b5-497b-9680-eea20721f9c7"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:38:05.501338 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.501298 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3679c3c0-c4b5-497b-9680-eea20721f9c7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3679c3c0-c4b5-497b-9680-eea20721f9c7" (UID: "3679c3c0-c4b5-497b-9680-eea20721f9c7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:38:05.548356 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.548326 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3679c3c0-c4b5-497b-9680-eea20721f9c7-kserve-provision-location\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:38:05.548356 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.548352 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3679c3c0-c4b5-497b-9680-eea20721f9c7-model-cache\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:38:05.548570 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.548364 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3679c3c0-c4b5-497b-9680-eea20721f9c7-home\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:38:05.548570 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.548375 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3679c3c0-c4b5-497b-9680-eea20721f9c7-dshm\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:38:05.548570 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.548386 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3679c3c0-c4b5-497b-9680-eea20721f9c7-tls-certs\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:38:05.548570 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.548397 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nkxmf\" (UniqueName: \"kubernetes.io/projected/3679c3c0-c4b5-497b-9680-eea20721f9c7-kube-api-access-nkxmf\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:38:05.576007 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.575976 2577 generic.go:358] "Generic (PLEG): container finished" podID="3679c3c0-c4b5-497b-9680-eea20721f9c7" containerID="58d850d87fcc82457976ff2f04625477527a718931dcf40cd6401c5e1da84a3f" exitCode=0
Apr 16 16:38:05.576148 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.576047 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx"
Apr 16 16:38:05.576148 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.576067 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx" event={"ID":"3679c3c0-c4b5-497b-9680-eea20721f9c7","Type":"ContainerDied","Data":"58d850d87fcc82457976ff2f04625477527a718931dcf40cd6401c5e1da84a3f"}
Apr 16 16:38:05.576148 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.576117 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx" event={"ID":"3679c3c0-c4b5-497b-9680-eea20721f9c7","Type":"ContainerDied","Data":"48b5e68c3d5285387f0d86e9ef94c67ad838f1655285567e285c24af8f5271af"}
Apr 16 16:38:05.576148 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.576137 2577 scope.go:117] "RemoveContainer" containerID="58d850d87fcc82457976ff2f04625477527a718931dcf40cd6401c5e1da84a3f"
Apr 16 16:38:05.587339 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.587322 2577 scope.go:117] "RemoveContainer" containerID="e0681013f7b2950a22d6e9a7a58e9b216fb977525553f0f36b444293b76f4882"
Apr 16 16:38:05.599037 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.599013 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx"]
Apr 16 16:38:05.600613 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.600591 2577 scope.go:117] "RemoveContainer" containerID="58d850d87fcc82457976ff2f04625477527a718931dcf40cd6401c5e1da84a3f"
Apr 16 16:38:05.600878 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:38:05.600855 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58d850d87fcc82457976ff2f04625477527a718931dcf40cd6401c5e1da84a3f\": container with ID starting with 58d850d87fcc82457976ff2f04625477527a718931dcf40cd6401c5e1da84a3f not found: ID does not exist" containerID="58d850d87fcc82457976ff2f04625477527a718931dcf40cd6401c5e1da84a3f"
Apr 16 16:38:05.600975 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.600888 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58d850d87fcc82457976ff2f04625477527a718931dcf40cd6401c5e1da84a3f"} err="failed to get container status \"58d850d87fcc82457976ff2f04625477527a718931dcf40cd6401c5e1da84a3f\": rpc error: code = NotFound desc = could not find container \"58d850d87fcc82457976ff2f04625477527a718931dcf40cd6401c5e1da84a3f\": container with ID starting with 58d850d87fcc82457976ff2f04625477527a718931dcf40cd6401c5e1da84a3f not found: ID does not exist"
Apr 16 16:38:05.600975 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.600907 2577 scope.go:117] "RemoveContainer" containerID="e0681013f7b2950a22d6e9a7a58e9b216fb977525553f0f36b444293b76f4882"
Apr 16 16:38:05.601170 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:38:05.601138 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0681013f7b2950a22d6e9a7a58e9b216fb977525553f0f36b444293b76f4882\": container with ID starting with e0681013f7b2950a22d6e9a7a58e9b216fb977525553f0f36b444293b76f4882 not found: ID does not exist" containerID="e0681013f7b2950a22d6e9a7a58e9b216fb977525553f0f36b444293b76f4882"
Apr 16 16:38:05.601244 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.601174 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0681013f7b2950a22d6e9a7a58e9b216fb977525553f0f36b444293b76f4882"} err="failed to get container status \"e0681013f7b2950a22d6e9a7a58e9b216fb977525553f0f36b444293b76f4882\": rpc error: code = NotFound desc = could not find container \"e0681013f7b2950a22d6e9a7a58e9b216fb977525553f0f36b444293b76f4882\": container with ID starting with e0681013f7b2950a22d6e9a7a58e9b216fb977525553f0f36b444293b76f4882 not found: ID does not exist"
Apr 16 16:38:05.602265 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.602243 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5d687994f-pdkfx"]
Apr 16 16:38:05.650611 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:05.650580 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3679c3c0-c4b5-497b-9680-eea20721f9c7" path="/var/lib/kubelet/pods/3679c3c0-c4b5-497b-9680-eea20721f9c7/volumes"
Apr 16 16:38:16.169880 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:16.169842 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc"]
Apr 16 16:38:16.170358 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:16.170237 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3679c3c0-c4b5-497b-9680-eea20721f9c7" containerName="main"
Apr 16 16:38:16.170358 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:16.170250 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="3679c3c0-c4b5-497b-9680-eea20721f9c7" containerName="main"
Apr 16 16:38:16.170358 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:16.170261 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3679c3c0-c4b5-497b-9680-eea20721f9c7" containerName="storage-initializer"
Apr 16 16:38:16.170358 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:16.170267 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="3679c3c0-c4b5-497b-9680-eea20721f9c7" containerName="storage-initializer"
Apr 16 16:38:16.170358 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:16.170332 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="3679c3c0-c4b5-497b-9680-eea20721f9c7" containerName="main"
Apr 16 16:38:16.173604 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:16.173580 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc"
Apr 16 16:38:16.176184 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:16.176155 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\""
Apr 16 16:38:16.176561 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:16.176537 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4ddlh\""
Apr 16 16:38:16.184277 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:16.184249 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc"]
Apr 16 16:38:16.241270 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:16.241230 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf7js\" (UniqueName: \"kubernetes.io/projected/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-kube-api-access-qf7js\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc\" (UID: \"4a25ec7f-e571-4fe9-990c-beb3e63e92ea\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc"
Apr 16 16:38:16.241477 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:16.241282 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc\" (UID: \"4a25ec7f-e571-4fe9-990c-beb3e63e92ea\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc"
Apr 16 16:38:16.241477 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:16.241301 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc\" (UID: \"4a25ec7f-e571-4fe9-990c-beb3e63e92ea\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc"
Apr 16 16:38:16.241477 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:16.241366 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc\" (UID: \"4a25ec7f-e571-4fe9-990c-beb3e63e92ea\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc"
Apr 16 16:38:16.241477 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:16.241432 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc\" (UID: \"4a25ec7f-e571-4fe9-990c-beb3e63e92ea\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc"
Apr 16 16:38:16.241626 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:16.241482 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc\" (UID: \"4a25ec7f-e571-4fe9-990c-beb3e63e92ea\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc"
Apr 16 16:38:16.342247 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:16.342216 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc\" (UID: \"4a25ec7f-e571-4fe9-990c-beb3e63e92ea\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc"
Apr 16 16:38:16.342461 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:16.342259 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc\" (UID: \"4a25ec7f-e571-4fe9-990c-beb3e63e92ea\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc"
Apr 16 16:38:16.342461 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:16.342305 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qf7js\" (UniqueName: \"kubernetes.io/projected/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-kube-api-access-qf7js\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc\" (UID: \"4a25ec7f-e571-4fe9-990c-beb3e63e92ea\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc"
Apr 16 16:38:16.342461 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:16.342348 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc\" (UID: \"4a25ec7f-e571-4fe9-990c-beb3e63e92ea\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc"
Apr 16 16:38:16.342461 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:16.342371 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc\" (UID: \"4a25ec7f-e571-4fe9-990c-beb3e63e92ea\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc"
Apr 16 16:38:16.342461 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:16.342413 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc\" (UID: \"4a25ec7f-e571-4fe9-990c-beb3e63e92ea\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc"
Apr 16 16:38:16.342729 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:16.342704 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc\" (UID: \"4a25ec7f-e571-4fe9-990c-beb3e63e92ea\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc"
Apr 16 16:38:16.342783 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:16.342730 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc\" (UID: \"4a25ec7f-e571-4fe9-990c-beb3e63e92ea\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc"
Apr 16 16:38:16.342838 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:16.342810 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc\" (UID: \"4a25ec7f-e571-4fe9-990c-beb3e63e92ea\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc"
Apr 16 16:38:16.344751 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:16.344701 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc\" (UID: \"4a25ec7f-e571-4fe9-990c-beb3e63e92ea\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc"
Apr 16 16:38:16.345086 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:16.345067 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc\" (UID: \"4a25ec7f-e571-4fe9-990c-beb3e63e92ea\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc"
Apr 16 16:38:16.352204 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:16.352178 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf7js\" (UniqueName: \"kubernetes.io/projected/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-kube-api-access-qf7js\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc\" (UID: \"4a25ec7f-e571-4fe9-990c-beb3e63e92ea\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc"
Apr 16 16:38:16.485843 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:16.485755 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc"
Apr 16 16:38:16.618595 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:16.618563 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc"]
Apr 16 16:38:16.621505 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:38:16.621470 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a25ec7f_e571_4fe9_990c_beb3e63e92ea.slice/crio-eb539abec329764d883131b1820dc113261a7625ba17e51d2034e2f43e9fb3b6 WatchSource:0}: Error finding container eb539abec329764d883131b1820dc113261a7625ba17e51d2034e2f43e9fb3b6: Status 404 returned error can't find the container with id eb539abec329764d883131b1820dc113261a7625ba17e51d2034e2f43e9fb3b6
Apr 16 16:38:17.622193 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:17.622158 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc" event={"ID":"4a25ec7f-e571-4fe9-990c-beb3e63e92ea","Type":"ContainerStarted","Data":"5bcd44824d2a0c8d8f96da490bcf7832fabbe4348977b8878f4cddb1248dafac"}
Apr 16 16:38:17.622633 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:17.622206 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc" event={"ID":"4a25ec7f-e571-4fe9-990c-beb3e63e92ea","Type":"ContainerStarted","Data":"eb539abec329764d883131b1820dc113261a7625ba17e51d2034e2f43e9fb3b6"}
Apr 16 16:38:21.640423 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:21.640387 2577 generic.go:358] "Generic (PLEG): container finished" podID="4a25ec7f-e571-4fe9-990c-beb3e63e92ea" containerID="5bcd44824d2a0c8d8f96da490bcf7832fabbe4348977b8878f4cddb1248dafac" exitCode=0
Apr 16 16:38:21.640869 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:21.640465 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc" event={"ID":"4a25ec7f-e571-4fe9-990c-beb3e63e92ea","Type":"ContainerDied","Data":"5bcd44824d2a0c8d8f96da490bcf7832fabbe4348977b8878f4cddb1248dafac"}
Apr 16 16:38:37.359556 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:37.359523 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt"]
Apr 16 16:38:37.452038 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:37.451998 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt"]
Apr 16 16:38:37.452200 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:37.452168 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt"
Apr 16 16:38:37.455108 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:37.455084 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\""
Apr 16 16:38:37.551679 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:37.551538 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/90535920-0ed9-4e33-b899-e1e4314574fd-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-554b968997-qxqzt\" (UID: \"90535920-0ed9-4e33-b899-e1e4314574fd\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt"
Apr 16 16:38:37.551679 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:37.551653 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/90535920-0ed9-4e33-b899-e1e4314574fd-tls-certs\") pod \"precise-prefix-cache-test-kserve-554b968997-qxqzt\" (UID: \"90535920-0ed9-4e33-b899-e1e4314574fd\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt"
Apr 16 16:38:37.551912 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:37.551764 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/90535920-0ed9-4e33-b899-e1e4314574fd-dshm\") pod \"precise-prefix-cache-test-kserve-554b968997-qxqzt\" (UID: \"90535920-0ed9-4e33-b899-e1e4314574fd\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt"
Apr 16 16:38:37.551912 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:37.551839 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/90535920-0ed9-4e33-b899-e1e4314574fd-model-cache\") pod \"precise-prefix-cache-test-kserve-554b968997-qxqzt\" (UID: \"90535920-0ed9-4e33-b899-e1e4314574fd\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt"
Apr 16 16:38:37.551912 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:37.551866 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/90535920-0ed9-4e33-b899-e1e4314574fd-home\") pod \"precise-prefix-cache-test-kserve-554b968997-qxqzt\" (UID: \"90535920-0ed9-4e33-b899-e1e4314574fd\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt"
Apr 16 16:38:37.551912 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:37.551891 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b67gw\" (UniqueName: \"kubernetes.io/projected/90535920-0ed9-4e33-b899-e1e4314574fd-kube-api-access-b67gw\") pod \"precise-prefix-cache-test-kserve-554b968997-qxqzt\" (UID: \"90535920-0ed9-4e33-b899-e1e4314574fd\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt"
Apr 16 16:38:37.652529 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:37.652434 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/90535920-0ed9-4e33-b899-e1e4314574fd-tls-certs\") pod \"precise-prefix-cache-test-kserve-554b968997-qxqzt\" (UID: \"90535920-0ed9-4e33-b899-e1e4314574fd\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt"
Apr 16 16:38:37.652529 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:37.652500 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/90535920-0ed9-4e33-b899-e1e4314574fd-dshm\") pod \"precise-prefix-cache-test-kserve-554b968997-qxqzt\" (UID: \"90535920-0ed9-4e33-b899-e1e4314574fd\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt"
Apr 16 16:38:37.652763 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:37.652543 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/90535920-0ed9-4e33-b899-e1e4314574fd-model-cache\") pod \"precise-prefix-cache-test-kserve-554b968997-qxqzt\" (UID: \"90535920-0ed9-4e33-b899-e1e4314574fd\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt"
Apr 16 16:38:37.652763 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:37.652560 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/90535920-0ed9-4e33-b899-e1e4314574fd-home\") pod \"precise-prefix-cache-test-kserve-554b968997-qxqzt\" (UID: \"90535920-0ed9-4e33-b899-e1e4314574fd\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt"
Apr 16 16:38:37.652763 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:37.652584 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b67gw\" (UniqueName: \"kubernetes.io/projected/90535920-0ed9-4e33-b899-e1e4314574fd-kube-api-access-b67gw\") pod \"precise-prefix-cache-test-kserve-554b968997-qxqzt\" (UID: \"90535920-0ed9-4e33-b899-e1e4314574fd\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt"
Apr 16 16:38:37.652763 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:37.652621 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/90535920-0ed9-4e33-b899-e1e4314574fd-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-554b968997-qxqzt\" (UID: \"90535920-0ed9-4e33-b899-e1e4314574fd\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt"
Apr 16 16:38:37.653018 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:37.652994 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/90535920-0ed9-4e33-b899-e1e4314574fd-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-554b968997-qxqzt\" (UID: \"90535920-0ed9-4e33-b899-e1e4314574fd\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt"
Apr 16 16:38:37.653097 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:37.653029 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/90535920-0ed9-4e33-b899-e1e4314574fd-model-cache\") pod \"precise-prefix-cache-test-kserve-554b968997-qxqzt\" (UID: \"90535920-0ed9-4e33-b899-e1e4314574fd\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt"
Apr 16 16:38:37.653160 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:37.653103 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/90535920-0ed9-4e33-b899-e1e4314574fd-home\") pod \"precise-prefix-cache-test-kserve-554b968997-qxqzt\" (UID: \"90535920-0ed9-4e33-b899-e1e4314574fd\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt"
Apr 16 16:38:37.655292 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:37.655267 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/90535920-0ed9-4e33-b899-e1e4314574fd-tls-certs\") pod \"precise-prefix-cache-test-kserve-554b968997-qxqzt\" (UID: \"90535920-0ed9-4e33-b899-e1e4314574fd\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt"
Apr 16 16:38:37.655663 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:37.655641 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/90535920-0ed9-4e33-b899-e1e4314574fd-dshm\") pod \"precise-prefix-cache-test-kserve-554b968997-qxqzt\" (UID: \"90535920-0ed9-4e33-b899-e1e4314574fd\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt"
Apr 16 16:38:37.664215 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:37.664189 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b67gw\" (UniqueName: \"kubernetes.io/projected/90535920-0ed9-4e33-b899-e1e4314574fd-kube-api-access-b67gw\") pod \"precise-prefix-cache-test-kserve-554b968997-qxqzt\" (UID: \"90535920-0ed9-4e33-b899-e1e4314574fd\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt"
Apr 16 16:38:37.764116 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:37.764086 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt"
Apr 16 16:38:48.063795 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:48.063712 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt"]
Apr 16 16:38:48.066171 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:38:48.066141 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90535920_0ed9_4e33_b899_e1e4314574fd.slice/crio-9df6c8b1c7ad9fad98fedcac3473571a4198ae75ce16f75b71273b1a9caf5102 WatchSource:0}: Error finding container 9df6c8b1c7ad9fad98fedcac3473571a4198ae75ce16f75b71273b1a9caf5102: Status 404 returned error can't find the container with id 9df6c8b1c7ad9fad98fedcac3473571a4198ae75ce16f75b71273b1a9caf5102
Apr 16 16:38:48.772504 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:48.772439 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt" event={"ID":"90535920-0ed9-4e33-b899-e1e4314574fd","Type":"ContainerStarted","Data":"fc9599c5142a258780bfb1ce42b17249695292cda51fd475a734a79c1367f99c"}
Apr 16 16:38:48.772504 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:48.772507 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt" event={"ID":"90535920-0ed9-4e33-b899-e1e4314574fd","Type":"ContainerStarted","Data":"9df6c8b1c7ad9fad98fedcac3473571a4198ae75ce16f75b71273b1a9caf5102"}
Apr 16 16:38:48.774279 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:48.774244 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc" event={"ID":"4a25ec7f-e571-4fe9-990c-beb3e63e92ea","Type":"ContainerStarted","Data":"4db9f7d3890930c3de9cdd962a57e278e6347595868f7a2293b76fae0fe0b6d1"}
Apr 16 16:38:48.810361 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:48.810287 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc" podStartSLOduration=6.106173727 podStartE2EDuration="32.810268324s" podCreationTimestamp="2026-04-16 16:38:16 +0000 UTC" firstStartedPulling="2026-04-16 16:38:21.641694908 +0000 UTC m=+930.606475382" lastFinishedPulling="2026-04-16 16:38:48.345789504 +0000 UTC m=+957.310569979" observedRunningTime="2026-04-16 16:38:48.808401105 +0000 UTC m=+957.773181614" watchObservedRunningTime="2026-04-16 16:38:48.810268324 +0000 UTC m=+957.775048821"
Apr 16 16:38:52.791170 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:52.791136 2577 generic.go:358] "Generic (PLEG): container finished" podID="90535920-0ed9-4e33-b899-e1e4314574fd" containerID="fc9599c5142a258780bfb1ce42b17249695292cda51fd475a734a79c1367f99c" exitCode=0
Apr 16 16:38:52.791580 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:52.791214 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt" event={"ID":"90535920-0ed9-4e33-b899-e1e4314574fd","Type":"ContainerDied","Data":"fc9599c5142a258780bfb1ce42b17249695292cda51fd475a734a79c1367f99c"} Apr 16 16:38:53.796384 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:53.796353 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt" event={"ID":"90535920-0ed9-4e33-b899-e1e4314574fd","Type":"ContainerStarted","Data":"9e5bcc13a13066ec02e00fcaaf2d6375ac145b42133813836755113b797bc2a3"} Apr 16 16:38:53.816943 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:53.816897 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt" podStartSLOduration=16.816880994 podStartE2EDuration="16.816880994s" podCreationTimestamp="2026-04-16 16:38:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:38:53.815622008 +0000 UTC m=+962.780402505" watchObservedRunningTime="2026-04-16 16:38:53.816880994 +0000 UTC m=+962.781661490" Apr 16 16:38:56.486374 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:56.486338 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc" Apr 16 16:38:56.486374 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:56.486374 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc" Apr 16 16:38:56.488002 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:56.487970 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc" podUID="4a25ec7f-e571-4fe9-990c-beb3e63e92ea" containerName="main" 
probeResult="failure" output="Get \"https://10.133.0.30:8000/health\": dial tcp 10.133.0.30:8000: connect: connection refused" Apr 16 16:38:57.764723 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:57.764682 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt" Apr 16 16:38:57.764723 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:57.764723 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt" Apr 16 16:38:57.778156 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:57.778123 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt" Apr 16 16:38:57.822852 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:38:57.822825 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt" Apr 16 16:39:06.486936 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:06.486882 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc" podUID="4a25ec7f-e571-4fe9-990c-beb3e63e92ea" containerName="main" probeResult="failure" output="Get \"https://10.133.0.30:8000/health\": dial tcp 10.133.0.30:8000: connect: connection refused" Apr 16 16:39:16.486601 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:16.486508 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc" podUID="4a25ec7f-e571-4fe9-990c-beb3e63e92ea" containerName="main" probeResult="failure" output="Get \"https://10.133.0.30:8000/health\": dial tcp 10.133.0.30:8000: connect: connection refused" Apr 16 16:39:26.486572 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:26.486523 2577 prober.go:120] 
"Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc" podUID="4a25ec7f-e571-4fe9-990c-beb3e63e92ea" containerName="main" probeResult="failure" output="Get \"https://10.133.0.30:8000/health\": dial tcp 10.133.0.30:8000: connect: connection refused" Apr 16 16:39:29.682070 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:29.682039 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt"] Apr 16 16:39:29.684550 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:29.682382 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt" podUID="90535920-0ed9-4e33-b899-e1e4314574fd" containerName="main" containerID="cri-o://9e5bcc13a13066ec02e00fcaaf2d6375ac145b42133813836755113b797bc2a3" gracePeriod=30 Apr 16 16:39:29.928162 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:29.928132 2577 generic.go:358] "Generic (PLEG): container finished" podID="90535920-0ed9-4e33-b899-e1e4314574fd" containerID="9e5bcc13a13066ec02e00fcaaf2d6375ac145b42133813836755113b797bc2a3" exitCode=0 Apr 16 16:39:29.928290 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:29.928202 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt" event={"ID":"90535920-0ed9-4e33-b899-e1e4314574fd","Type":"ContainerDied","Data":"9e5bcc13a13066ec02e00fcaaf2d6375ac145b42133813836755113b797bc2a3"} Apr 16 16:39:29.947565 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:29.947542 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt" Apr 16 16:39:30.036939 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:30.036906 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/90535920-0ed9-4e33-b899-e1e4314574fd-kserve-provision-location\") pod \"90535920-0ed9-4e33-b899-e1e4314574fd\" (UID: \"90535920-0ed9-4e33-b899-e1e4314574fd\") " Apr 16 16:39:30.036939 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:30.036945 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/90535920-0ed9-4e33-b899-e1e4314574fd-tls-certs\") pod \"90535920-0ed9-4e33-b899-e1e4314574fd\" (UID: \"90535920-0ed9-4e33-b899-e1e4314574fd\") " Apr 16 16:39:30.037178 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:30.037005 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/90535920-0ed9-4e33-b899-e1e4314574fd-model-cache\") pod \"90535920-0ed9-4e33-b899-e1e4314574fd\" (UID: \"90535920-0ed9-4e33-b899-e1e4314574fd\") " Apr 16 16:39:30.037178 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:30.037028 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/90535920-0ed9-4e33-b899-e1e4314574fd-dshm\") pod \"90535920-0ed9-4e33-b899-e1e4314574fd\" (UID: \"90535920-0ed9-4e33-b899-e1e4314574fd\") " Apr 16 16:39:30.037178 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:30.037050 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/90535920-0ed9-4e33-b899-e1e4314574fd-home\") pod \"90535920-0ed9-4e33-b899-e1e4314574fd\" (UID: \"90535920-0ed9-4e33-b899-e1e4314574fd\") " Apr 16 16:39:30.037178 ip-10-0-130-165 kubenswrapper[2577]: I0416 
16:39:30.037073 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b67gw\" (UniqueName: \"kubernetes.io/projected/90535920-0ed9-4e33-b899-e1e4314574fd-kube-api-access-b67gw\") pod \"90535920-0ed9-4e33-b899-e1e4314574fd\" (UID: \"90535920-0ed9-4e33-b899-e1e4314574fd\") " Apr 16 16:39:30.037432 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:30.037315 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90535920-0ed9-4e33-b899-e1e4314574fd-model-cache" (OuterVolumeSpecName: "model-cache") pod "90535920-0ed9-4e33-b899-e1e4314574fd" (UID: "90535920-0ed9-4e33-b899-e1e4314574fd"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:39:30.037432 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:30.037370 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90535920-0ed9-4e33-b899-e1e4314574fd-home" (OuterVolumeSpecName: "home") pod "90535920-0ed9-4e33-b899-e1e4314574fd" (UID: "90535920-0ed9-4e33-b899-e1e4314574fd"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:39:30.039492 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:30.039423 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90535920-0ed9-4e33-b899-e1e4314574fd-kube-api-access-b67gw" (OuterVolumeSpecName: "kube-api-access-b67gw") pod "90535920-0ed9-4e33-b899-e1e4314574fd" (UID: "90535920-0ed9-4e33-b899-e1e4314574fd"). InnerVolumeSpecName "kube-api-access-b67gw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:39:30.039691 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:30.039664 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90535920-0ed9-4e33-b899-e1e4314574fd-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "90535920-0ed9-4e33-b899-e1e4314574fd" (UID: "90535920-0ed9-4e33-b899-e1e4314574fd"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:39:30.039754 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:30.039686 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90535920-0ed9-4e33-b899-e1e4314574fd-dshm" (OuterVolumeSpecName: "dshm") pod "90535920-0ed9-4e33-b899-e1e4314574fd" (UID: "90535920-0ed9-4e33-b899-e1e4314574fd"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:39:30.094072 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:30.094028 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90535920-0ed9-4e33-b899-e1e4314574fd-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "90535920-0ed9-4e33-b899-e1e4314574fd" (UID: "90535920-0ed9-4e33-b899-e1e4314574fd"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:39:30.138026 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:30.137991 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/90535920-0ed9-4e33-b899-e1e4314574fd-model-cache\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:39:30.138026 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:30.138022 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/90535920-0ed9-4e33-b899-e1e4314574fd-dshm\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:39:30.138026 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:30.138033 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/90535920-0ed9-4e33-b899-e1e4314574fd-home\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:39:30.138262 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:30.138047 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b67gw\" (UniqueName: \"kubernetes.io/projected/90535920-0ed9-4e33-b899-e1e4314574fd-kube-api-access-b67gw\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:39:30.138262 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:30.138060 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/90535920-0ed9-4e33-b899-e1e4314574fd-kserve-provision-location\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:39:30.138262 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:30.138072 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/90535920-0ed9-4e33-b899-e1e4314574fd-tls-certs\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:39:30.932864 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:30.932831 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt" event={"ID":"90535920-0ed9-4e33-b899-e1e4314574fd","Type":"ContainerDied","Data":"9df6c8b1c7ad9fad98fedcac3473571a4198ae75ce16f75b71273b1a9caf5102"} Apr 16 16:39:30.933296 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:30.932879 2577 scope.go:117] "RemoveContainer" containerID="9e5bcc13a13066ec02e00fcaaf2d6375ac145b42133813836755113b797bc2a3" Apr 16 16:39:30.933296 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:30.932920 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt" Apr 16 16:39:30.942165 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:30.942144 2577 scope.go:117] "RemoveContainer" containerID="fc9599c5142a258780bfb1ce42b17249695292cda51fd475a734a79c1367f99c" Apr 16 16:39:30.958804 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:30.958774 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt"] Apr 16 16:39:30.962691 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:30.962666 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-qxqzt"] Apr 16 16:39:31.654076 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:31.654034 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90535920-0ed9-4e33-b899-e1e4314574fd" path="/var/lib/kubelet/pods/90535920-0ed9-4e33-b899-e1e4314574fd/volumes" Apr 16 16:39:36.486630 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:36.486588 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc" podUID="4a25ec7f-e571-4fe9-990c-beb3e63e92ea" containerName="main" probeResult="failure" output="Get \"https://10.133.0.30:8000/health\": dial tcp 10.133.0.30:8000: 
connect: connection refused" Apr 16 16:39:45.176626 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:45.176593 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2"] Apr 16 16:39:45.177238 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:45.177219 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90535920-0ed9-4e33-b899-e1e4314574fd" containerName="storage-initializer" Apr 16 16:39:45.177321 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:45.177241 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="90535920-0ed9-4e33-b899-e1e4314574fd" containerName="storage-initializer" Apr 16 16:39:45.177321 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:45.177263 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90535920-0ed9-4e33-b899-e1e4314574fd" containerName="main" Apr 16 16:39:45.177321 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:45.177272 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="90535920-0ed9-4e33-b899-e1e4314574fd" containerName="main" Apr 16 16:39:45.177524 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:45.177354 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="90535920-0ed9-4e33-b899-e1e4314574fd" containerName="main" Apr 16 16:39:45.180821 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:45.180800 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" Apr 16 16:39:45.183574 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:45.183550 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 16 16:39:45.191700 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:45.191675 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2"] Apr 16 16:39:45.281192 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:45.281156 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/925b12bd-cb2e-4cdb-90ff-b5265260bfee-model-cache\") pod \"stop-feature-test-kserve-5f6c45bb9b-pnzp2\" (UID: \"925b12bd-cb2e-4cdb-90ff-b5265260bfee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" Apr 16 16:39:45.281386 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:45.281208 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpsgw\" (UniqueName: \"kubernetes.io/projected/925b12bd-cb2e-4cdb-90ff-b5265260bfee-kube-api-access-bpsgw\") pod \"stop-feature-test-kserve-5f6c45bb9b-pnzp2\" (UID: \"925b12bd-cb2e-4cdb-90ff-b5265260bfee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" Apr 16 16:39:45.281386 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:45.281306 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/925b12bd-cb2e-4cdb-90ff-b5265260bfee-kserve-provision-location\") pod \"stop-feature-test-kserve-5f6c45bb9b-pnzp2\" (UID: \"925b12bd-cb2e-4cdb-90ff-b5265260bfee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" Apr 16 16:39:45.281386 ip-10-0-130-165 
kubenswrapper[2577]: I0416 16:39:45.281358 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/925b12bd-cb2e-4cdb-90ff-b5265260bfee-tls-certs\") pod \"stop-feature-test-kserve-5f6c45bb9b-pnzp2\" (UID: \"925b12bd-cb2e-4cdb-90ff-b5265260bfee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" Apr 16 16:39:45.281543 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:45.281402 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/925b12bd-cb2e-4cdb-90ff-b5265260bfee-home\") pod \"stop-feature-test-kserve-5f6c45bb9b-pnzp2\" (UID: \"925b12bd-cb2e-4cdb-90ff-b5265260bfee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" Apr 16 16:39:45.281543 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:45.281427 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/925b12bd-cb2e-4cdb-90ff-b5265260bfee-dshm\") pod \"stop-feature-test-kserve-5f6c45bb9b-pnzp2\" (UID: \"925b12bd-cb2e-4cdb-90ff-b5265260bfee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" Apr 16 16:39:45.382891 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:45.382850 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/925b12bd-cb2e-4cdb-90ff-b5265260bfee-kserve-provision-location\") pod \"stop-feature-test-kserve-5f6c45bb9b-pnzp2\" (UID: \"925b12bd-cb2e-4cdb-90ff-b5265260bfee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" Apr 16 16:39:45.382891 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:45.382902 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/925b12bd-cb2e-4cdb-90ff-b5265260bfee-tls-certs\") pod \"stop-feature-test-kserve-5f6c45bb9b-pnzp2\" (UID: \"925b12bd-cb2e-4cdb-90ff-b5265260bfee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" Apr 16 16:39:45.383156 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:45.382938 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/925b12bd-cb2e-4cdb-90ff-b5265260bfee-home\") pod \"stop-feature-test-kserve-5f6c45bb9b-pnzp2\" (UID: \"925b12bd-cb2e-4cdb-90ff-b5265260bfee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" Apr 16 16:39:45.383156 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:45.382965 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/925b12bd-cb2e-4cdb-90ff-b5265260bfee-dshm\") pod \"stop-feature-test-kserve-5f6c45bb9b-pnzp2\" (UID: \"925b12bd-cb2e-4cdb-90ff-b5265260bfee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" Apr 16 16:39:45.383156 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:45.383038 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/925b12bd-cb2e-4cdb-90ff-b5265260bfee-model-cache\") pod \"stop-feature-test-kserve-5f6c45bb9b-pnzp2\" (UID: \"925b12bd-cb2e-4cdb-90ff-b5265260bfee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" Apr 16 16:39:45.383156 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:45.383078 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bpsgw\" (UniqueName: \"kubernetes.io/projected/925b12bd-cb2e-4cdb-90ff-b5265260bfee-kube-api-access-bpsgw\") pod \"stop-feature-test-kserve-5f6c45bb9b-pnzp2\" (UID: \"925b12bd-cb2e-4cdb-90ff-b5265260bfee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" Apr 16 
16:39:45.383427 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:45.383374 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/925b12bd-cb2e-4cdb-90ff-b5265260bfee-home\") pod \"stop-feature-test-kserve-5f6c45bb9b-pnzp2\" (UID: \"925b12bd-cb2e-4cdb-90ff-b5265260bfee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" Apr 16 16:39:45.383591 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:45.383438 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/925b12bd-cb2e-4cdb-90ff-b5265260bfee-model-cache\") pod \"stop-feature-test-kserve-5f6c45bb9b-pnzp2\" (UID: \"925b12bd-cb2e-4cdb-90ff-b5265260bfee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" Apr 16 16:39:45.383591 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:45.383483 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/925b12bd-cb2e-4cdb-90ff-b5265260bfee-kserve-provision-location\") pod \"stop-feature-test-kserve-5f6c45bb9b-pnzp2\" (UID: \"925b12bd-cb2e-4cdb-90ff-b5265260bfee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" Apr 16 16:39:45.385466 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:45.385413 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/925b12bd-cb2e-4cdb-90ff-b5265260bfee-dshm\") pod \"stop-feature-test-kserve-5f6c45bb9b-pnzp2\" (UID: \"925b12bd-cb2e-4cdb-90ff-b5265260bfee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" Apr 16 16:39:45.385693 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:45.385669 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/925b12bd-cb2e-4cdb-90ff-b5265260bfee-tls-certs\") pod 
\"stop-feature-test-kserve-5f6c45bb9b-pnzp2\" (UID: \"925b12bd-cb2e-4cdb-90ff-b5265260bfee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" Apr 16 16:39:45.393599 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:45.393572 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpsgw\" (UniqueName: \"kubernetes.io/projected/925b12bd-cb2e-4cdb-90ff-b5265260bfee-kube-api-access-bpsgw\") pod \"stop-feature-test-kserve-5f6c45bb9b-pnzp2\" (UID: \"925b12bd-cb2e-4cdb-90ff-b5265260bfee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" Apr 16 16:39:45.493222 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:45.493128 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" Apr 16 16:39:45.631653 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:45.631608 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2"] Apr 16 16:39:45.634046 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:39:45.634017 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod925b12bd_cb2e_4cdb_90ff_b5265260bfee.slice/crio-52a82c215570b50e8b369fb52a27940606bef2c07f80cbd664154e0e2a659240 WatchSource:0}: Error finding container 52a82c215570b50e8b369fb52a27940606bef2c07f80cbd664154e0e2a659240: Status 404 returned error can't find the container with id 52a82c215570b50e8b369fb52a27940606bef2c07f80cbd664154e0e2a659240 Apr 16 16:39:45.987275 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:45.987230 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" event={"ID":"925b12bd-cb2e-4cdb-90ff-b5265260bfee","Type":"ContainerStarted","Data":"07747db9b1ed8719928e3d6d11eefe089992522a8e0e9305b0ef713e39cfc217"} Apr 16 16:39:45.987275 ip-10-0-130-165 
kubenswrapper[2577]: I0416 16:39:45.987277 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" event={"ID":"925b12bd-cb2e-4cdb-90ff-b5265260bfee","Type":"ContainerStarted","Data":"52a82c215570b50e8b369fb52a27940606bef2c07f80cbd664154e0e2a659240"}
Apr 16 16:39:46.486872 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:46.486827 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc" podUID="4a25ec7f-e571-4fe9-990c-beb3e63e92ea" containerName="main" probeResult="failure" output="Get \"https://10.133.0.30:8000/health\": dial tcp 10.133.0.30:8000: connect: connection refused"
Apr 16 16:39:50.012731 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:50.012693 2577 generic.go:358] "Generic (PLEG): container finished" podID="925b12bd-cb2e-4cdb-90ff-b5265260bfee" containerID="07747db9b1ed8719928e3d6d11eefe089992522a8e0e9305b0ef713e39cfc217" exitCode=0
Apr 16 16:39:50.013113 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:50.012764 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" event={"ID":"925b12bd-cb2e-4cdb-90ff-b5265260bfee","Type":"ContainerDied","Data":"07747db9b1ed8719928e3d6d11eefe089992522a8e0e9305b0ef713e39cfc217"}
Apr 16 16:39:51.018291 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:51.018251 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" event={"ID":"925b12bd-cb2e-4cdb-90ff-b5265260bfee","Type":"ContainerStarted","Data":"30911a111c8579669e21110a864d42c1017d820fe9f02a0b782f08c80374e9d2"}
Apr 16 16:39:51.040045 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:51.039968 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" podStartSLOduration=6.039945812 podStartE2EDuration="6.039945812s" podCreationTimestamp="2026-04-16 16:39:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:39:51.038216503 +0000 UTC m=+1020.002997004" watchObservedRunningTime="2026-04-16 16:39:51.039945812 +0000 UTC m=+1020.004726309"
Apr 16 16:39:55.493556 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:55.493517 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2"
Apr 16 16:39:55.493556 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:55.493562 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2"
Apr 16 16:39:55.495263 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:55.495232 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" podUID="925b12bd-cb2e-4cdb-90ff-b5265260bfee" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused"
Apr 16 16:39:56.487160 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:39:56.487108 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc" podUID="4a25ec7f-e571-4fe9-990c-beb3e63e92ea" containerName="main" probeResult="failure" output="Get \"https://10.133.0.30:8000/health\": dial tcp 10.133.0.30:8000: connect: connection refused"
Apr 16 16:40:05.493813 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:05.493761 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" podUID="925b12bd-cb2e-4cdb-90ff-b5265260bfee" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused"
Apr 16 16:40:06.486246 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:06.486194 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc" podUID="4a25ec7f-e571-4fe9-990c-beb3e63e92ea" containerName="main" probeResult="failure" output="Get \"https://10.133.0.30:8000/health\": dial tcp 10.133.0.30:8000: connect: connection refused"
Apr 16 16:40:15.494031 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:15.493968 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" podUID="925b12bd-cb2e-4cdb-90ff-b5265260bfee" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused"
Apr 16 16:40:16.487133 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:16.487079 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc" podUID="4a25ec7f-e571-4fe9-990c-beb3e63e92ea" containerName="main" probeResult="failure" output="Get \"https://10.133.0.30:8000/health\": dial tcp 10.133.0.30:8000: connect: connection refused"
Apr 16 16:40:25.493874 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:25.493827 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" podUID="925b12bd-cb2e-4cdb-90ff-b5265260bfee" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused"
Apr 16 16:40:26.496543 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:26.496508 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc"
Apr 16 16:40:26.504936 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:26.504902 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc"
Apr 16 16:40:33.081706 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:33.081674 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc"]
Apr 16 16:40:33.082178 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:33.082058 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc" podUID="4a25ec7f-e571-4fe9-990c-beb3e63e92ea" containerName="main" containerID="cri-o://4db9f7d3890930c3de9cdd962a57e278e6347595868f7a2293b76fae0fe0b6d1" gracePeriod=30
Apr 16 16:40:35.494637 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:35.494579 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" podUID="925b12bd-cb2e-4cdb-90ff-b5265260bfee" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused"
Apr 16 16:40:39.969752 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:39.969676 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj"]
Apr 16 16:40:39.973042 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:39.973024 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj"
Apr 16 16:40:39.975604 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:39.975581 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\""
Apr 16 16:40:39.983200 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:39.983172 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj"]
Apr 16 16:40:40.108360 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:40.108315 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/28faf7fa-6d32-4452-a411-1a6061173dae-home\") pod \"custom-route-timeout-test-kserve-786fb6d9fb-f28dj\" (UID: \"28faf7fa-6d32-4452-a411-1a6061173dae\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj"
Apr 16 16:40:40.108562 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:40.108390 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/28faf7fa-6d32-4452-a411-1a6061173dae-tls-certs\") pod \"custom-route-timeout-test-kserve-786fb6d9fb-f28dj\" (UID: \"28faf7fa-6d32-4452-a411-1a6061173dae\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj"
Apr 16 16:40:40.108562 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:40.108424 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/28faf7fa-6d32-4452-a411-1a6061173dae-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-786fb6d9fb-f28dj\" (UID: \"28faf7fa-6d32-4452-a411-1a6061173dae\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj"
Apr 16 16:40:40.108562 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:40.108544 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/28faf7fa-6d32-4452-a411-1a6061173dae-model-cache\") pod \"custom-route-timeout-test-kserve-786fb6d9fb-f28dj\" (UID: \"28faf7fa-6d32-4452-a411-1a6061173dae\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj"
Apr 16 16:40:40.108695 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:40.108612 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh8tw\" (UniqueName: \"kubernetes.io/projected/28faf7fa-6d32-4452-a411-1a6061173dae-kube-api-access-nh8tw\") pod \"custom-route-timeout-test-kserve-786fb6d9fb-f28dj\" (UID: \"28faf7fa-6d32-4452-a411-1a6061173dae\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj"
Apr 16 16:40:40.108695 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:40.108650 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/28faf7fa-6d32-4452-a411-1a6061173dae-dshm\") pod \"custom-route-timeout-test-kserve-786fb6d9fb-f28dj\" (UID: \"28faf7fa-6d32-4452-a411-1a6061173dae\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj"
Apr 16 16:40:40.209846 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:40.209797 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nh8tw\" (UniqueName: \"kubernetes.io/projected/28faf7fa-6d32-4452-a411-1a6061173dae-kube-api-access-nh8tw\") pod \"custom-route-timeout-test-kserve-786fb6d9fb-f28dj\" (UID: \"28faf7fa-6d32-4452-a411-1a6061173dae\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj"
Apr 16 16:40:40.210026 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:40.209863 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/28faf7fa-6d32-4452-a411-1a6061173dae-dshm\") pod \"custom-route-timeout-test-kserve-786fb6d9fb-f28dj\" (UID: \"28faf7fa-6d32-4452-a411-1a6061173dae\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj"
Apr 16 16:40:40.210026 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:40.209934 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/28faf7fa-6d32-4452-a411-1a6061173dae-home\") pod \"custom-route-timeout-test-kserve-786fb6d9fb-f28dj\" (UID: \"28faf7fa-6d32-4452-a411-1a6061173dae\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj"
Apr 16 16:40:40.210026 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:40.209979 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/28faf7fa-6d32-4452-a411-1a6061173dae-tls-certs\") pod \"custom-route-timeout-test-kserve-786fb6d9fb-f28dj\" (UID: \"28faf7fa-6d32-4452-a411-1a6061173dae\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj"
Apr 16 16:40:40.210026 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:40.210005 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/28faf7fa-6d32-4452-a411-1a6061173dae-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-786fb6d9fb-f28dj\" (UID: \"28faf7fa-6d32-4452-a411-1a6061173dae\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj"
Apr 16 16:40:40.210208 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:40.210062 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/28faf7fa-6d32-4452-a411-1a6061173dae-model-cache\") pod \"custom-route-timeout-test-kserve-786fb6d9fb-f28dj\" (UID: \"28faf7fa-6d32-4452-a411-1a6061173dae\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj"
Apr 16 16:40:40.210440 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:40.210414 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/28faf7fa-6d32-4452-a411-1a6061173dae-home\") pod \"custom-route-timeout-test-kserve-786fb6d9fb-f28dj\" (UID: \"28faf7fa-6d32-4452-a411-1a6061173dae\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj"
Apr 16 16:40:40.210440 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:40.210435 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/28faf7fa-6d32-4452-a411-1a6061173dae-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-786fb6d9fb-f28dj\" (UID: \"28faf7fa-6d32-4452-a411-1a6061173dae\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj"
Apr 16 16:40:40.210619 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:40.210517 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/28faf7fa-6d32-4452-a411-1a6061173dae-model-cache\") pod \"custom-route-timeout-test-kserve-786fb6d9fb-f28dj\" (UID: \"28faf7fa-6d32-4452-a411-1a6061173dae\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj"
Apr 16 16:40:40.212774 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:40.212749 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/28faf7fa-6d32-4452-a411-1a6061173dae-dshm\") pod \"custom-route-timeout-test-kserve-786fb6d9fb-f28dj\" (UID: \"28faf7fa-6d32-4452-a411-1a6061173dae\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj"
Apr 16 16:40:40.213015 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:40.212997 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/28faf7fa-6d32-4452-a411-1a6061173dae-tls-certs\") pod \"custom-route-timeout-test-kserve-786fb6d9fb-f28dj\" (UID: \"28faf7fa-6d32-4452-a411-1a6061173dae\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj"
Apr 16 16:40:40.220558 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:40.220490 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh8tw\" (UniqueName: \"kubernetes.io/projected/28faf7fa-6d32-4452-a411-1a6061173dae-kube-api-access-nh8tw\") pod \"custom-route-timeout-test-kserve-786fb6d9fb-f28dj\" (UID: \"28faf7fa-6d32-4452-a411-1a6061173dae\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj"
Apr 16 16:40:40.286607 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:40.286563 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj"
Apr 16 16:40:40.437840 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:40.437703 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj"]
Apr 16 16:40:40.440964 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:40:40.440932 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28faf7fa_6d32_4452_a411_1a6061173dae.slice/crio-50b18c5b092a0c8ec15b94c5a3f755da703b16b7b9d2bb2b554f8c58c27aeff8 WatchSource:0}: Error finding container 50b18c5b092a0c8ec15b94c5a3f755da703b16b7b9d2bb2b554f8c58c27aeff8: Status 404 returned error can't find the container with id 50b18c5b092a0c8ec15b94c5a3f755da703b16b7b9d2bb2b554f8c58c27aeff8
Apr 16 16:40:41.208608 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:41.208558 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj" event={"ID":"28faf7fa-6d32-4452-a411-1a6061173dae","Type":"ContainerStarted","Data":"606d9e2a526de58966727e2a5fdd644fa9b0cbf3a9c8b1154dfb5cde7e2c59f5"}
Apr 16 16:40:41.208608 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:41.208613 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj" event={"ID":"28faf7fa-6d32-4452-a411-1a6061173dae","Type":"ContainerStarted","Data":"50b18c5b092a0c8ec15b94c5a3f755da703b16b7b9d2bb2b554f8c58c27aeff8"}
Apr 16 16:40:45.224650 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:45.224615 2577 generic.go:358] "Generic (PLEG): container finished" podID="28faf7fa-6d32-4452-a411-1a6061173dae" containerID="606d9e2a526de58966727e2a5fdd644fa9b0cbf3a9c8b1154dfb5cde7e2c59f5" exitCode=0
Apr 16 16:40:45.225028 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:45.224689 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj" event={"ID":"28faf7fa-6d32-4452-a411-1a6061173dae","Type":"ContainerDied","Data":"606d9e2a526de58966727e2a5fdd644fa9b0cbf3a9c8b1154dfb5cde7e2c59f5"}
Apr 16 16:40:45.493912 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:45.493815 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" podUID="925b12bd-cb2e-4cdb-90ff-b5265260bfee" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused"
Apr 16 16:40:46.230056 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:46.230015 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj" event={"ID":"28faf7fa-6d32-4452-a411-1a6061173dae","Type":"ContainerStarted","Data":"c89e2d35d84de315ac6656170404c7ae5c781d6185db87fcd007c03d71c5dfdc"}
Apr 16 16:40:46.255066 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:46.255005 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj" podStartSLOduration=7.254984146 podStartE2EDuration="7.254984146s" podCreationTimestamp="2026-04-16 16:40:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:40:46.252542804 +0000 UTC m=+1075.217323300" watchObservedRunningTime="2026-04-16 16:40:46.254984146 +0000 UTC m=+1075.219764646"
Apr 16 16:40:50.287238 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:50.287200 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj"
Apr 16 16:40:50.287671 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:50.287391 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj"
Apr 16 16:40:50.288821 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:50.288792 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj" podUID="28faf7fa-6d32-4452-a411-1a6061173dae" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8000/health\": dial tcp 10.133.0.33:8000: connect: connection refused"
Apr 16 16:40:55.494600 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:40:55.494553 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" podUID="925b12bd-cb2e-4cdb-90ff-b5265260bfee" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused"
Apr 16 16:41:00.287931 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:00.287883 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj" podUID="28faf7fa-6d32-4452-a411-1a6061173dae" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8000/health\": dial tcp 10.133.0.33:8000: connect: connection refused"
Apr 16 16:41:03.300470 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:03.300214 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc_4a25ec7f-e571-4fe9-990c-beb3e63e92ea/main/0.log"
Apr 16 16:41:03.300951 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:03.300649 2577 generic.go:358] "Generic (PLEG): container finished" podID="4a25ec7f-e571-4fe9-990c-beb3e63e92ea" containerID="4db9f7d3890930c3de9cdd962a57e278e6347595868f7a2293b76fae0fe0b6d1" exitCode=137
Apr 16 16:41:03.300951 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:03.300801 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc" event={"ID":"4a25ec7f-e571-4fe9-990c-beb3e63e92ea","Type":"ContainerDied","Data":"4db9f7d3890930c3de9cdd962a57e278e6347595868f7a2293b76fae0fe0b6d1"}
Apr 16 16:41:03.357937 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:03.357910 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc_4a25ec7f-e571-4fe9-990c-beb3e63e92ea/main/0.log"
Apr 16 16:41:03.358353 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:03.358333 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc"
Apr 16 16:41:03.439638 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:03.439604 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-dshm\") pod \"4a25ec7f-e571-4fe9-990c-beb3e63e92ea\" (UID: \"4a25ec7f-e571-4fe9-990c-beb3e63e92ea\") "
Apr 16 16:41:03.439832 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:03.439644 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-tls-certs\") pod \"4a25ec7f-e571-4fe9-990c-beb3e63e92ea\" (UID: \"4a25ec7f-e571-4fe9-990c-beb3e63e92ea\") "
Apr 16 16:41:03.439832 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:03.439669 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-model-cache\") pod \"4a25ec7f-e571-4fe9-990c-beb3e63e92ea\" (UID: \"4a25ec7f-e571-4fe9-990c-beb3e63e92ea\") "
Apr 16 16:41:03.439832 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:03.439705 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-home\") pod \"4a25ec7f-e571-4fe9-990c-beb3e63e92ea\" (UID: \"4a25ec7f-e571-4fe9-990c-beb3e63e92ea\") "
Apr 16 16:41:03.439832 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:03.439748 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qf7js\" (UniqueName: \"kubernetes.io/projected/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-kube-api-access-qf7js\") pod \"4a25ec7f-e571-4fe9-990c-beb3e63e92ea\" (UID: \"4a25ec7f-e571-4fe9-990c-beb3e63e92ea\") "
Apr 16 16:41:03.439832 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:03.439783 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-kserve-provision-location\") pod \"4a25ec7f-e571-4fe9-990c-beb3e63e92ea\" (UID: \"4a25ec7f-e571-4fe9-990c-beb3e63e92ea\") "
Apr 16 16:41:03.440094 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:03.439931 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-model-cache" (OuterVolumeSpecName: "model-cache") pod "4a25ec7f-e571-4fe9-990c-beb3e63e92ea" (UID: "4a25ec7f-e571-4fe9-990c-beb3e63e92ea"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:41:03.440094 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:03.440049 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-model-cache\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:41:03.440201 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:03.440149 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-home" (OuterVolumeSpecName: "home") pod "4a25ec7f-e571-4fe9-990c-beb3e63e92ea" (UID: "4a25ec7f-e571-4fe9-990c-beb3e63e92ea"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:41:03.442141 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:03.442102 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-dshm" (OuterVolumeSpecName: "dshm") pod "4a25ec7f-e571-4fe9-990c-beb3e63e92ea" (UID: "4a25ec7f-e571-4fe9-990c-beb3e63e92ea"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:41:03.442408 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:03.442390 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "4a25ec7f-e571-4fe9-990c-beb3e63e92ea" (UID: "4a25ec7f-e571-4fe9-990c-beb3e63e92ea"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:41:03.442534 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:03.442501 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-kube-api-access-qf7js" (OuterVolumeSpecName: "kube-api-access-qf7js") pod "4a25ec7f-e571-4fe9-990c-beb3e63e92ea" (UID: "4a25ec7f-e571-4fe9-990c-beb3e63e92ea"). InnerVolumeSpecName "kube-api-access-qf7js". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:41:03.502511 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:03.502465 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4a25ec7f-e571-4fe9-990c-beb3e63e92ea" (UID: "4a25ec7f-e571-4fe9-990c-beb3e63e92ea"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:41:03.540861 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:03.540817 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-dshm\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:41:03.540861 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:03.540864 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-tls-certs\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:41:03.541081 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:03.540880 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-home\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:41:03.541081 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:03.540895 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qf7js\" (UniqueName: \"kubernetes.io/projected/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-kube-api-access-qf7js\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:41:03.541081 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:03.540910 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a25ec7f-e571-4fe9-990c-beb3e63e92ea-kserve-provision-location\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:41:04.305934 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:04.305900 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc_4a25ec7f-e571-4fe9-990c-beb3e63e92ea/main/0.log"
Apr 16 16:41:04.306372 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:04.306350 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc"
Apr 16 16:41:04.306472 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:04.306345 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc" event={"ID":"4a25ec7f-e571-4fe9-990c-beb3e63e92ea","Type":"ContainerDied","Data":"eb539abec329764d883131b1820dc113261a7625ba17e51d2034e2f43e9fb3b6"}
Apr 16 16:41:04.306519 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:04.306497 2577 scope.go:117] "RemoveContainer" containerID="4db9f7d3890930c3de9cdd962a57e278e6347595868f7a2293b76fae0fe0b6d1"
Apr 16 16:41:04.327438 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:04.327417 2577 scope.go:117] "RemoveContainer" containerID="5bcd44824d2a0c8d8f96da490bcf7832fabbe4348977b8878f4cddb1248dafac"
Apr 16 16:41:04.335417 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:04.335387 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc"]
Apr 16 16:41:04.343573 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:04.343544 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-5c6f4f9f98762tc"]
Apr 16 16:41:05.494497 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:05.494432 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" podUID="925b12bd-cb2e-4cdb-90ff-b5265260bfee" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused"
Apr 16 16:41:05.651246 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:05.651212 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a25ec7f-e571-4fe9-990c-beb3e63e92ea" path="/var/lib/kubelet/pods/4a25ec7f-e571-4fe9-990c-beb3e63e92ea/volumes"
Apr 16 16:41:10.287143 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:10.287097 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj" podUID="28faf7fa-6d32-4452-a411-1a6061173dae" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8000/health\": dial tcp 10.133.0.33:8000: connect: connection refused"
Apr 16 16:41:15.494507 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:15.494427 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" podUID="925b12bd-cb2e-4cdb-90ff-b5265260bfee" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused"
Apr 16 16:41:20.287307 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:20.287252 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj" podUID="28faf7fa-6d32-4452-a411-1a6061173dae" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8000/health\": dial tcp 10.133.0.33:8000: connect: connection refused"
Apr 16 16:41:25.494351 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:25.494285 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" podUID="925b12bd-cb2e-4cdb-90ff-b5265260bfee" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused"
Apr 16 16:41:30.286983 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:30.286940 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj" podUID="28faf7fa-6d32-4452-a411-1a6061173dae" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8000/health\": dial tcp 10.133.0.33:8000: connect: connection refused"
Apr 16 16:41:35.503731 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:35.503696 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2"
Apr 16 16:41:35.511781 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:35.511746 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2"
Apr 16 16:41:36.554081 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:41:36.554047 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found
Apr 16 16:41:36.554579 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:41:36.554141 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/925b12bd-cb2e-4cdb-90ff-b5265260bfee-tls-certs podName:925b12bd-cb2e-4cdb-90ff-b5265260bfee nodeName:}" failed. No retries permitted until 2026-04-16 16:41:37.054117394 +0000 UTC m=+1126.018897871 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/925b12bd-cb2e-4cdb-90ff-b5265260bfee-tls-certs") pod "stop-feature-test-kserve-5f6c45bb9b-pnzp2" (UID: "925b12bd-cb2e-4cdb-90ff-b5265260bfee") : secret "stop-feature-test-kserve-self-signed-certs" not found
Apr 16 16:41:36.557049 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:36.557020 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2"]
Apr 16 16:41:37.057508 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:41:37.057470 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found
Apr 16 16:41:37.057678 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:41:37.057558 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/925b12bd-cb2e-4cdb-90ff-b5265260bfee-tls-certs podName:925b12bd-cb2e-4cdb-90ff-b5265260bfee nodeName:}" failed. No retries permitted until 2026-04-16 16:41:38.057539917 +0000 UTC m=+1127.022320392 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/925b12bd-cb2e-4cdb-90ff-b5265260bfee-tls-certs") pod "stop-feature-test-kserve-5f6c45bb9b-pnzp2" (UID: "925b12bd-cb2e-4cdb-90ff-b5265260bfee") : secret "stop-feature-test-kserve-self-signed-certs" not found
Apr 16 16:41:37.434485 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:37.434406 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" podUID="925b12bd-cb2e-4cdb-90ff-b5265260bfee" containerName="main" containerID="cri-o://30911a111c8579669e21110a864d42c1017d820fe9f02a0b782f08c80374e9d2" gracePeriod=30
Apr 16 16:41:38.074727 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:41:38.074687 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found
Apr 16 16:41:38.075208 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:41:38.074769 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/925b12bd-cb2e-4cdb-90ff-b5265260bfee-tls-certs podName:925b12bd-cb2e-4cdb-90ff-b5265260bfee nodeName:}" failed. No retries permitted until 2026-04-16 16:41:40.07474891 +0000 UTC m=+1129.039529387 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/925b12bd-cb2e-4cdb-90ff-b5265260bfee-tls-certs") pod "stop-feature-test-kserve-5f6c45bb9b-pnzp2" (UID: "925b12bd-cb2e-4cdb-90ff-b5265260bfee") : secret "stop-feature-test-kserve-self-signed-certs" not found
Apr 16 16:41:40.093316 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:41:40.093284 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found
Apr 16 16:41:40.093730 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:41:40.093363 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/925b12bd-cb2e-4cdb-90ff-b5265260bfee-tls-certs podName:925b12bd-cb2e-4cdb-90ff-b5265260bfee nodeName:}" failed. No retries permitted until 2026-04-16 16:41:44.09334828 +0000 UTC m=+1133.058128759 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/925b12bd-cb2e-4cdb-90ff-b5265260bfee-tls-certs") pod "stop-feature-test-kserve-5f6c45bb9b-pnzp2" (UID: "925b12bd-cb2e-4cdb-90ff-b5265260bfee") : secret "stop-feature-test-kserve-self-signed-certs" not found
Apr 16 16:41:40.287749 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:40.287695 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj" podUID="28faf7fa-6d32-4452-a411-1a6061173dae" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8000/health\": dial tcp 10.133.0.33:8000: connect: connection refused"
Apr 16 16:41:44.128886 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:41:44.128841 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found
Apr 16 16:41:44.129304 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:41:44.128949 2577
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/925b12bd-cb2e-4cdb-90ff-b5265260bfee-tls-certs podName:925b12bd-cb2e-4cdb-90ff-b5265260bfee nodeName:}" failed. No retries permitted until 2026-04-16 16:41:52.128926302 +0000 UTC m=+1141.093706794 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/925b12bd-cb2e-4cdb-90ff-b5265260bfee-tls-certs") pod "stop-feature-test-kserve-5f6c45bb9b-pnzp2" (UID: "925b12bd-cb2e-4cdb-90ff-b5265260bfee") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 16 16:41:50.287636 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:50.287587 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj" podUID="28faf7fa-6d32-4452-a411-1a6061173dae" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8000/health\": dial tcp 10.133.0.33:8000: connect: connection refused" Apr 16 16:41:52.200757 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:41:52.200719 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 16 16:41:52.201154 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:41:52.200799 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/925b12bd-cb2e-4cdb-90ff-b5265260bfee-tls-certs podName:925b12bd-cb2e-4cdb-90ff-b5265260bfee nodeName:}" failed. No retries permitted until 2026-04-16 16:42:08.200783906 +0000 UTC m=+1157.165564380 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/925b12bd-cb2e-4cdb-90ff-b5265260bfee-tls-certs") pod "stop-feature-test-kserve-5f6c45bb9b-pnzp2" (UID: "925b12bd-cb2e-4cdb-90ff-b5265260bfee") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 16 16:41:56.413989 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:56.413954 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd"] Apr 16 16:41:56.414547 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:56.414528 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4a25ec7f-e571-4fe9-990c-beb3e63e92ea" containerName="main" Apr 16 16:41:56.414606 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:56.414551 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a25ec7f-e571-4fe9-990c-beb3e63e92ea" containerName="main" Apr 16 16:41:56.414606 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:56.414571 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4a25ec7f-e571-4fe9-990c-beb3e63e92ea" containerName="storage-initializer" Apr 16 16:41:56.414606 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:56.414581 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a25ec7f-e571-4fe9-990c-beb3e63e92ea" containerName="storage-initializer" Apr 16 16:41:56.414739 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:56.414681 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="4a25ec7f-e571-4fe9-990c-beb3e63e92ea" containerName="main" Apr 16 16:41:56.419265 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:56.419245 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" Apr 16 16:41:56.445462 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:56.445415 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd"] Apr 16 16:41:56.544875 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:56.544836 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/82aadc57-1271-4666-b8fc-078cd616fef7-tls-certs\") pod \"stop-feature-test-kserve-5f6c45bb9b-slkzd\" (UID: \"82aadc57-1271-4666-b8fc-078cd616fef7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" Apr 16 16:41:56.544875 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:56.544877 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/82aadc57-1271-4666-b8fc-078cd616fef7-model-cache\") pod \"stop-feature-test-kserve-5f6c45bb9b-slkzd\" (UID: \"82aadc57-1271-4666-b8fc-078cd616fef7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" Apr 16 16:41:56.545104 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:56.544958 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/82aadc57-1271-4666-b8fc-078cd616fef7-home\") pod \"stop-feature-test-kserve-5f6c45bb9b-slkzd\" (UID: \"82aadc57-1271-4666-b8fc-078cd616fef7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" Apr 16 16:41:56.545104 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:56.544999 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/82aadc57-1271-4666-b8fc-078cd616fef7-kserve-provision-location\") pod 
\"stop-feature-test-kserve-5f6c45bb9b-slkzd\" (UID: \"82aadc57-1271-4666-b8fc-078cd616fef7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" Apr 16 16:41:56.545104 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:56.545092 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/82aadc57-1271-4666-b8fc-078cd616fef7-dshm\") pod \"stop-feature-test-kserve-5f6c45bb9b-slkzd\" (UID: \"82aadc57-1271-4666-b8fc-078cd616fef7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" Apr 16 16:41:56.545254 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:56.545170 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7m4s\" (UniqueName: \"kubernetes.io/projected/82aadc57-1271-4666-b8fc-078cd616fef7-kube-api-access-p7m4s\") pod \"stop-feature-test-kserve-5f6c45bb9b-slkzd\" (UID: \"82aadc57-1271-4666-b8fc-078cd616fef7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" Apr 16 16:41:56.645846 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:56.645810 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/82aadc57-1271-4666-b8fc-078cd616fef7-kserve-provision-location\") pod \"stop-feature-test-kserve-5f6c45bb9b-slkzd\" (UID: \"82aadc57-1271-4666-b8fc-078cd616fef7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" Apr 16 16:41:56.646020 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:56.645869 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/82aadc57-1271-4666-b8fc-078cd616fef7-dshm\") pod \"stop-feature-test-kserve-5f6c45bb9b-slkzd\" (UID: \"82aadc57-1271-4666-b8fc-078cd616fef7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" Apr 16 16:41:56.646020 
ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:56.645905 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7m4s\" (UniqueName: \"kubernetes.io/projected/82aadc57-1271-4666-b8fc-078cd616fef7-kube-api-access-p7m4s\") pod \"stop-feature-test-kserve-5f6c45bb9b-slkzd\" (UID: \"82aadc57-1271-4666-b8fc-078cd616fef7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" Apr 16 16:41:56.646020 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:56.645939 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/82aadc57-1271-4666-b8fc-078cd616fef7-tls-certs\") pod \"stop-feature-test-kserve-5f6c45bb9b-slkzd\" (UID: \"82aadc57-1271-4666-b8fc-078cd616fef7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" Apr 16 16:41:56.646020 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:56.645956 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/82aadc57-1271-4666-b8fc-078cd616fef7-model-cache\") pod \"stop-feature-test-kserve-5f6c45bb9b-slkzd\" (UID: \"82aadc57-1271-4666-b8fc-078cd616fef7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" Apr 16 16:41:56.646020 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:56.645999 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/82aadc57-1271-4666-b8fc-078cd616fef7-home\") pod \"stop-feature-test-kserve-5f6c45bb9b-slkzd\" (UID: \"82aadc57-1271-4666-b8fc-078cd616fef7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" Apr 16 16:41:56.646287 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:56.646260 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/82aadc57-1271-4666-b8fc-078cd616fef7-kserve-provision-location\") pod \"stop-feature-test-kserve-5f6c45bb9b-slkzd\" (UID: \"82aadc57-1271-4666-b8fc-078cd616fef7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" Apr 16 16:41:56.646576 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:56.646551 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/82aadc57-1271-4666-b8fc-078cd616fef7-model-cache\") pod \"stop-feature-test-kserve-5f6c45bb9b-slkzd\" (UID: \"82aadc57-1271-4666-b8fc-078cd616fef7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" Apr 16 16:41:56.646576 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:56.646564 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/82aadc57-1271-4666-b8fc-078cd616fef7-home\") pod \"stop-feature-test-kserve-5f6c45bb9b-slkzd\" (UID: \"82aadc57-1271-4666-b8fc-078cd616fef7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" Apr 16 16:41:56.648279 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:56.648250 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/82aadc57-1271-4666-b8fc-078cd616fef7-dshm\") pod \"stop-feature-test-kserve-5f6c45bb9b-slkzd\" (UID: \"82aadc57-1271-4666-b8fc-078cd616fef7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" Apr 16 16:41:56.648531 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:56.648514 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/82aadc57-1271-4666-b8fc-078cd616fef7-tls-certs\") pod \"stop-feature-test-kserve-5f6c45bb9b-slkzd\" (UID: \"82aadc57-1271-4666-b8fc-078cd616fef7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" Apr 16 16:41:56.675989 ip-10-0-130-165 kubenswrapper[2577]: 
I0416 16:41:56.675909 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7m4s\" (UniqueName: \"kubernetes.io/projected/82aadc57-1271-4666-b8fc-078cd616fef7-kube-api-access-p7m4s\") pod \"stop-feature-test-kserve-5f6c45bb9b-slkzd\" (UID: \"82aadc57-1271-4666-b8fc-078cd616fef7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" Apr 16 16:41:56.730047 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:56.730006 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" Apr 16 16:41:56.901168 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:56.901130 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd"] Apr 16 16:41:56.905229 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:41:56.905196 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82aadc57_1271_4666_b8fc_078cd616fef7.slice/crio-2595de567c04af3b503fca8da9abdf17feacc626f034fae62dd5d45fe8686618 WatchSource:0}: Error finding container 2595de567c04af3b503fca8da9abdf17feacc626f034fae62dd5d45fe8686618: Status 404 returned error can't find the container with id 2595de567c04af3b503fca8da9abdf17feacc626f034fae62dd5d45fe8686618 Apr 16 16:41:56.907424 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:56.907409 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:41:57.505198 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:57.505163 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" event={"ID":"82aadc57-1271-4666-b8fc-078cd616fef7","Type":"ContainerStarted","Data":"8611e50cdc54b1dcbbfe818ce4e4490f0ca7f37c4e9ac94a25facf618f86132f"} Apr 16 16:41:57.505198 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:41:57.505199 
2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" event={"ID":"82aadc57-1271-4666-b8fc-078cd616fef7","Type":"ContainerStarted","Data":"2595de567c04af3b503fca8da9abdf17feacc626f034fae62dd5d45fe8686618"} Apr 16 16:42:00.287097 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:00.287048 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj" podUID="28faf7fa-6d32-4452-a411-1a6061173dae" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8000/health\": dial tcp 10.133.0.33:8000: connect: connection refused" Apr 16 16:42:01.522492 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:01.522456 2577 generic.go:358] "Generic (PLEG): container finished" podID="82aadc57-1271-4666-b8fc-078cd616fef7" containerID="8611e50cdc54b1dcbbfe818ce4e4490f0ca7f37c4e9ac94a25facf618f86132f" exitCode=0 Apr 16 16:42:01.522963 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:01.522519 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" event={"ID":"82aadc57-1271-4666-b8fc-078cd616fef7","Type":"ContainerDied","Data":"8611e50cdc54b1dcbbfe818ce4e4490f0ca7f37c4e9ac94a25facf618f86132f"} Apr 16 16:42:02.527963 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:02.527924 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" event={"ID":"82aadc57-1271-4666-b8fc-078cd616fef7","Type":"ContainerStarted","Data":"cf9f024ff6f2b16fbb1e590069067a6a80c18583da1ae0ec1463ea7890707b3f"} Apr 16 16:42:02.568275 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:02.568212 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" podStartSLOduration=6.568191006 podStartE2EDuration="6.568191006s" podCreationTimestamp="2026-04-16 16:41:56 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:42:02.566574866 +0000 UTC m=+1151.531355376" watchObservedRunningTime="2026-04-16 16:42:02.568191006 +0000 UTC m=+1151.532971505" Apr 16 16:42:06.730801 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:06.730759 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" Apr 16 16:42:06.731314 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:06.730814 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" Apr 16 16:42:06.732209 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:06.732179 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" podUID="82aadc57-1271-4666-b8fc-078cd616fef7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused" Apr 16 16:42:07.728993 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:07.728964 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-5f6c45bb9b-pnzp2_925b12bd-cb2e-4cdb-90ff-b5265260bfee/main/0.log" Apr 16 16:42:07.729371 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:07.729354 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" Apr 16 16:42:07.863387 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:07.863357 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/925b12bd-cb2e-4cdb-90ff-b5265260bfee-kserve-provision-location\") pod \"925b12bd-cb2e-4cdb-90ff-b5265260bfee\" (UID: \"925b12bd-cb2e-4cdb-90ff-b5265260bfee\") " Apr 16 16:42:07.863387 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:07.863401 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpsgw\" (UniqueName: \"kubernetes.io/projected/925b12bd-cb2e-4cdb-90ff-b5265260bfee-kube-api-access-bpsgw\") pod \"925b12bd-cb2e-4cdb-90ff-b5265260bfee\" (UID: \"925b12bd-cb2e-4cdb-90ff-b5265260bfee\") " Apr 16 16:42:07.863921 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:07.863416 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/925b12bd-cb2e-4cdb-90ff-b5265260bfee-dshm\") pod \"925b12bd-cb2e-4cdb-90ff-b5265260bfee\" (UID: \"925b12bd-cb2e-4cdb-90ff-b5265260bfee\") " Apr 16 16:42:07.863921 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:07.863468 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/925b12bd-cb2e-4cdb-90ff-b5265260bfee-model-cache\") pod \"925b12bd-cb2e-4cdb-90ff-b5265260bfee\" (UID: \"925b12bd-cb2e-4cdb-90ff-b5265260bfee\") " Apr 16 16:42:07.863921 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:07.863542 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/925b12bd-cb2e-4cdb-90ff-b5265260bfee-home\") pod \"925b12bd-cb2e-4cdb-90ff-b5265260bfee\" (UID: \"925b12bd-cb2e-4cdb-90ff-b5265260bfee\") " Apr 16 16:42:07.863921 ip-10-0-130-165 
kubenswrapper[2577]: I0416 16:42:07.863572 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/925b12bd-cb2e-4cdb-90ff-b5265260bfee-tls-certs\") pod \"925b12bd-cb2e-4cdb-90ff-b5265260bfee\" (UID: \"925b12bd-cb2e-4cdb-90ff-b5265260bfee\") " Apr 16 16:42:07.864112 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:07.864027 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/925b12bd-cb2e-4cdb-90ff-b5265260bfee-model-cache" (OuterVolumeSpecName: "model-cache") pod "925b12bd-cb2e-4cdb-90ff-b5265260bfee" (UID: "925b12bd-cb2e-4cdb-90ff-b5265260bfee"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:42:07.864229 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:07.864185 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/925b12bd-cb2e-4cdb-90ff-b5265260bfee-home" (OuterVolumeSpecName: "home") pod "925b12bd-cb2e-4cdb-90ff-b5265260bfee" (UID: "925b12bd-cb2e-4cdb-90ff-b5265260bfee"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:42:07.866220 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:07.866185 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925b12bd-cb2e-4cdb-90ff-b5265260bfee-kube-api-access-bpsgw" (OuterVolumeSpecName: "kube-api-access-bpsgw") pod "925b12bd-cb2e-4cdb-90ff-b5265260bfee" (UID: "925b12bd-cb2e-4cdb-90ff-b5265260bfee"). InnerVolumeSpecName "kube-api-access-bpsgw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:42:07.866417 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:07.866394 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925b12bd-cb2e-4cdb-90ff-b5265260bfee-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "925b12bd-cb2e-4cdb-90ff-b5265260bfee" (UID: "925b12bd-cb2e-4cdb-90ff-b5265260bfee"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:42:07.866522 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:07.866394 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/925b12bd-cb2e-4cdb-90ff-b5265260bfee-dshm" (OuterVolumeSpecName: "dshm") pod "925b12bd-cb2e-4cdb-90ff-b5265260bfee" (UID: "925b12bd-cb2e-4cdb-90ff-b5265260bfee"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:42:07.921457 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:07.921376 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/925b12bd-cb2e-4cdb-90ff-b5265260bfee-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "925b12bd-cb2e-4cdb-90ff-b5265260bfee" (UID: "925b12bd-cb2e-4cdb-90ff-b5265260bfee"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:42:07.964826 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:07.964732 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/925b12bd-cb2e-4cdb-90ff-b5265260bfee-home\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:42:07.964826 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:07.964757 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/925b12bd-cb2e-4cdb-90ff-b5265260bfee-tls-certs\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:42:07.964826 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:07.964768 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/925b12bd-cb2e-4cdb-90ff-b5265260bfee-kserve-provision-location\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:42:07.964826 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:07.964779 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bpsgw\" (UniqueName: \"kubernetes.io/projected/925b12bd-cb2e-4cdb-90ff-b5265260bfee-kube-api-access-bpsgw\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:42:07.964826 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:07.964788 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/925b12bd-cb2e-4cdb-90ff-b5265260bfee-dshm\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:42:07.964826 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:07.964798 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/925b12bd-cb2e-4cdb-90ff-b5265260bfee-model-cache\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:42:08.554675 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:08.554643 2577 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-5f6c45bb9b-pnzp2_925b12bd-cb2e-4cdb-90ff-b5265260bfee/main/0.log" Apr 16 16:42:08.554998 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:08.554977 2577 generic.go:358] "Generic (PLEG): container finished" podID="925b12bd-cb2e-4cdb-90ff-b5265260bfee" containerID="30911a111c8579669e21110a864d42c1017d820fe9f02a0b782f08c80374e9d2" exitCode=137 Apr 16 16:42:08.555094 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:08.555047 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" Apr 16 16:42:08.555094 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:08.555055 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" event={"ID":"925b12bd-cb2e-4cdb-90ff-b5265260bfee","Type":"ContainerDied","Data":"30911a111c8579669e21110a864d42c1017d820fe9f02a0b782f08c80374e9d2"} Apr 16 16:42:08.555201 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:08.555099 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2" event={"ID":"925b12bd-cb2e-4cdb-90ff-b5265260bfee","Type":"ContainerDied","Data":"52a82c215570b50e8b369fb52a27940606bef2c07f80cbd664154e0e2a659240"} Apr 16 16:42:08.555201 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:08.555123 2577 scope.go:117] "RemoveContainer" containerID="30911a111c8579669e21110a864d42c1017d820fe9f02a0b782f08c80374e9d2" Apr 16 16:42:08.584764 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:08.584738 2577 scope.go:117] "RemoveContainer" containerID="07747db9b1ed8719928e3d6d11eefe089992522a8e0e9305b0ef713e39cfc217" Apr 16 16:42:08.591516 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:08.591470 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2"] Apr 16 
16:42:08.597899 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:08.597866 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-pnzp2"] Apr 16 16:42:08.667853 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:08.667828 2577 scope.go:117] "RemoveContainer" containerID="30911a111c8579669e21110a864d42c1017d820fe9f02a0b782f08c80374e9d2" Apr 16 16:42:08.668208 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:42:08.668178 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30911a111c8579669e21110a864d42c1017d820fe9f02a0b782f08c80374e9d2\": container with ID starting with 30911a111c8579669e21110a864d42c1017d820fe9f02a0b782f08c80374e9d2 not found: ID does not exist" containerID="30911a111c8579669e21110a864d42c1017d820fe9f02a0b782f08c80374e9d2" Apr 16 16:42:08.668268 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:08.668210 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30911a111c8579669e21110a864d42c1017d820fe9f02a0b782f08c80374e9d2"} err="failed to get container status \"30911a111c8579669e21110a864d42c1017d820fe9f02a0b782f08c80374e9d2\": rpc error: code = NotFound desc = could not find container \"30911a111c8579669e21110a864d42c1017d820fe9f02a0b782f08c80374e9d2\": container with ID starting with 30911a111c8579669e21110a864d42c1017d820fe9f02a0b782f08c80374e9d2 not found: ID does not exist" Apr 16 16:42:08.668268 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:08.668229 2577 scope.go:117] "RemoveContainer" containerID="07747db9b1ed8719928e3d6d11eefe089992522a8e0e9305b0ef713e39cfc217" Apr 16 16:42:08.668533 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:42:08.668508 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07747db9b1ed8719928e3d6d11eefe089992522a8e0e9305b0ef713e39cfc217\": container with ID starting with 
07747db9b1ed8719928e3d6d11eefe089992522a8e0e9305b0ef713e39cfc217 not found: ID does not exist" containerID="07747db9b1ed8719928e3d6d11eefe089992522a8e0e9305b0ef713e39cfc217" Apr 16 16:42:08.668581 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:08.668534 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07747db9b1ed8719928e3d6d11eefe089992522a8e0e9305b0ef713e39cfc217"} err="failed to get container status \"07747db9b1ed8719928e3d6d11eefe089992522a8e0e9305b0ef713e39cfc217\": rpc error: code = NotFound desc = could not find container \"07747db9b1ed8719928e3d6d11eefe089992522a8e0e9305b0ef713e39cfc217\": container with ID starting with 07747db9b1ed8719928e3d6d11eefe089992522a8e0e9305b0ef713e39cfc217 not found: ID does not exist" Apr 16 16:42:09.653345 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:09.653257 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925b12bd-cb2e-4cdb-90ff-b5265260bfee" path="/var/lib/kubelet/pods/925b12bd-cb2e-4cdb-90ff-b5265260bfee/volumes" Apr 16 16:42:10.288003 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:10.287949 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj" podUID="28faf7fa-6d32-4452-a411-1a6061173dae" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8000/health\": dial tcp 10.133.0.33:8000: connect: connection refused" Apr 16 16:42:16.731087 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:16.731042 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" podUID="82aadc57-1271-4666-b8fc-078cd616fef7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused" Apr 16 16:42:20.297907 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:20.297872 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj" Apr 16 16:42:20.306259 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:20.306229 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj" Apr 16 16:42:26.731099 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:26.731029 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" podUID="82aadc57-1271-4666-b8fc-078cd616fef7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused" Apr 16 16:42:27.015976 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:27.015877 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj"] Apr 16 16:42:27.016320 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:27.016257 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj" podUID="28faf7fa-6d32-4452-a411-1a6061173dae" containerName="main" containerID="cri-o://c89e2d35d84de315ac6656170404c7ae5c781d6185db87fcd007c03d71c5dfdc" gracePeriod=30 Apr 16 16:42:36.730684 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:36.730640 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" podUID="82aadc57-1271-4666-b8fc-078cd616fef7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused" Apr 16 16:42:38.783542 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:38.783505 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj"] Apr 16 16:42:38.784063 ip-10-0-130-165 kubenswrapper[2577]: 
I0416 16:42:38.784043 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="925b12bd-cb2e-4cdb-90ff-b5265260bfee" containerName="storage-initializer" Apr 16 16:42:38.784128 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:38.784067 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="925b12bd-cb2e-4cdb-90ff-b5265260bfee" containerName="storage-initializer" Apr 16 16:42:38.784128 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:38.784091 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="925b12bd-cb2e-4cdb-90ff-b5265260bfee" containerName="main" Apr 16 16:42:38.784128 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:38.784100 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="925b12bd-cb2e-4cdb-90ff-b5265260bfee" containerName="main" Apr 16 16:42:38.784279 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:38.784199 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="925b12bd-cb2e-4cdb-90ff-b5265260bfee" containerName="main" Apr 16 16:42:38.789072 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:38.789042 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" Apr 16 16:42:38.792264 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:38.792236 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 16 16:42:38.799729 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:38.799601 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj"] Apr 16 16:42:38.880369 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:38.880329 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6011ca43-9233-40aa-994e-5f3acaabf2ba-dshm\") pod \"router-with-refs-test-kserve-575fcb7644-7xbkj\" (UID: \"6011ca43-9233-40aa-994e-5f3acaabf2ba\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" Apr 16 16:42:38.880579 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:38.880378 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tnxw\" (UniqueName: \"kubernetes.io/projected/6011ca43-9233-40aa-994e-5f3acaabf2ba-kube-api-access-9tnxw\") pod \"router-with-refs-test-kserve-575fcb7644-7xbkj\" (UID: \"6011ca43-9233-40aa-994e-5f3acaabf2ba\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" Apr 16 16:42:38.880579 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:38.880525 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6011ca43-9233-40aa-994e-5f3acaabf2ba-tls-certs\") pod \"router-with-refs-test-kserve-575fcb7644-7xbkj\" (UID: \"6011ca43-9233-40aa-994e-5f3acaabf2ba\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" Apr 16 16:42:38.880688 ip-10-0-130-165 kubenswrapper[2577]: I0416 
16:42:38.880576 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6011ca43-9233-40aa-994e-5f3acaabf2ba-home\") pod \"router-with-refs-test-kserve-575fcb7644-7xbkj\" (UID: \"6011ca43-9233-40aa-994e-5f3acaabf2ba\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" Apr 16 16:42:38.880688 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:38.880615 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6011ca43-9233-40aa-994e-5f3acaabf2ba-kserve-provision-location\") pod \"router-with-refs-test-kserve-575fcb7644-7xbkj\" (UID: \"6011ca43-9233-40aa-994e-5f3acaabf2ba\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" Apr 16 16:42:38.880688 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:38.880672 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6011ca43-9233-40aa-994e-5f3acaabf2ba-model-cache\") pod \"router-with-refs-test-kserve-575fcb7644-7xbkj\" (UID: \"6011ca43-9233-40aa-994e-5f3acaabf2ba\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" Apr 16 16:42:38.982004 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:38.981962 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6011ca43-9233-40aa-994e-5f3acaabf2ba-model-cache\") pod \"router-with-refs-test-kserve-575fcb7644-7xbkj\" (UID: \"6011ca43-9233-40aa-994e-5f3acaabf2ba\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" Apr 16 16:42:38.982212 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:38.982017 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/6011ca43-9233-40aa-994e-5f3acaabf2ba-dshm\") pod \"router-with-refs-test-kserve-575fcb7644-7xbkj\" (UID: \"6011ca43-9233-40aa-994e-5f3acaabf2ba\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" Apr 16 16:42:38.982212 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:38.982043 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9tnxw\" (UniqueName: \"kubernetes.io/projected/6011ca43-9233-40aa-994e-5f3acaabf2ba-kube-api-access-9tnxw\") pod \"router-with-refs-test-kserve-575fcb7644-7xbkj\" (UID: \"6011ca43-9233-40aa-994e-5f3acaabf2ba\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" Apr 16 16:42:38.982212 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:38.982074 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6011ca43-9233-40aa-994e-5f3acaabf2ba-tls-certs\") pod \"router-with-refs-test-kserve-575fcb7644-7xbkj\" (UID: \"6011ca43-9233-40aa-994e-5f3acaabf2ba\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" Apr 16 16:42:38.982212 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:38.982101 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6011ca43-9233-40aa-994e-5f3acaabf2ba-home\") pod \"router-with-refs-test-kserve-575fcb7644-7xbkj\" (UID: \"6011ca43-9233-40aa-994e-5f3acaabf2ba\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" Apr 16 16:42:38.982212 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:38.982137 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6011ca43-9233-40aa-994e-5f3acaabf2ba-kserve-provision-location\") pod \"router-with-refs-test-kserve-575fcb7644-7xbkj\" (UID: \"6011ca43-9233-40aa-994e-5f3acaabf2ba\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" Apr 16 16:42:38.982515 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:38.982467 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6011ca43-9233-40aa-994e-5f3acaabf2ba-model-cache\") pod \"router-with-refs-test-kserve-575fcb7644-7xbkj\" (UID: \"6011ca43-9233-40aa-994e-5f3acaabf2ba\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" Apr 16 16:42:38.982577 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:38.982536 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6011ca43-9233-40aa-994e-5f3acaabf2ba-home\") pod \"router-with-refs-test-kserve-575fcb7644-7xbkj\" (UID: \"6011ca43-9233-40aa-994e-5f3acaabf2ba\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" Apr 16 16:42:38.982637 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:38.982598 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6011ca43-9233-40aa-994e-5f3acaabf2ba-kserve-provision-location\") pod \"router-with-refs-test-kserve-575fcb7644-7xbkj\" (UID: \"6011ca43-9233-40aa-994e-5f3acaabf2ba\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" Apr 16 16:42:38.984405 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:38.984383 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6011ca43-9233-40aa-994e-5f3acaabf2ba-dshm\") pod \"router-with-refs-test-kserve-575fcb7644-7xbkj\" (UID: \"6011ca43-9233-40aa-994e-5f3acaabf2ba\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" Apr 16 16:42:38.984690 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:38.984673 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" 
(UniqueName: \"kubernetes.io/secret/6011ca43-9233-40aa-994e-5f3acaabf2ba-tls-certs\") pod \"router-with-refs-test-kserve-575fcb7644-7xbkj\" (UID: \"6011ca43-9233-40aa-994e-5f3acaabf2ba\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" Apr 16 16:42:38.990984 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:38.990963 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tnxw\" (UniqueName: \"kubernetes.io/projected/6011ca43-9233-40aa-994e-5f3acaabf2ba-kube-api-access-9tnxw\") pod \"router-with-refs-test-kserve-575fcb7644-7xbkj\" (UID: \"6011ca43-9233-40aa-994e-5f3acaabf2ba\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" Apr 16 16:42:39.105046 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:39.104959 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" Apr 16 16:42:39.240074 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:39.240048 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj"] Apr 16 16:42:39.242082 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:42:39.242053 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6011ca43_9233_40aa_994e_5f3acaabf2ba.slice/crio-c953792a1e8a126bcb6dead20c12bd5f1257a9586db1131a260faaeb912c8934 WatchSource:0}: Error finding container c953792a1e8a126bcb6dead20c12bd5f1257a9586db1131a260faaeb912c8934: Status 404 returned error can't find the container with id c953792a1e8a126bcb6dead20c12bd5f1257a9586db1131a260faaeb912c8934 Apr 16 16:42:39.693226 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:39.693187 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" 
event={"ID":"6011ca43-9233-40aa-994e-5f3acaabf2ba","Type":"ContainerStarted","Data":"d3db97e5a1441b96d5caf9566f9cbd0338c418fa5b4ed4ae51cd0bc1e8f334f0"} Apr 16 16:42:39.693226 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:39.693230 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" event={"ID":"6011ca43-9233-40aa-994e-5f3acaabf2ba","Type":"ContainerStarted","Data":"c953792a1e8a126bcb6dead20c12bd5f1257a9586db1131a260faaeb912c8934"} Apr 16 16:42:44.715207 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:44.715164 2577 generic.go:358] "Generic (PLEG): container finished" podID="6011ca43-9233-40aa-994e-5f3acaabf2ba" containerID="d3db97e5a1441b96d5caf9566f9cbd0338c418fa5b4ed4ae51cd0bc1e8f334f0" exitCode=0 Apr 16 16:42:44.715600 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:44.715237 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" event={"ID":"6011ca43-9233-40aa-994e-5f3acaabf2ba","Type":"ContainerDied","Data":"d3db97e5a1441b96d5caf9566f9cbd0338c418fa5b4ed4ae51cd0bc1e8f334f0"} Apr 16 16:42:45.721952 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:45.721910 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" event={"ID":"6011ca43-9233-40aa-994e-5f3acaabf2ba","Type":"ContainerStarted","Data":"2e977664529d56dbd076040274b4694034e15c26deadad500e069ab525011360"} Apr 16 16:42:45.743402 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:45.743339 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" podStartSLOduration=7.743324022 podStartE2EDuration="7.743324022s" podCreationTimestamp="2026-04-16 16:42:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 
16:42:45.741972384 +0000 UTC m=+1194.706752880" watchObservedRunningTime="2026-04-16 16:42:45.743324022 +0000 UTC m=+1194.708104518" Apr 16 16:42:46.731297 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:46.731250 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" podUID="82aadc57-1271-4666-b8fc-078cd616fef7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused" Apr 16 16:42:49.105342 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:49.105291 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" Apr 16 16:42:49.105342 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:49.105344 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" Apr 16 16:42:49.106859 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:49.106824 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" podUID="6011ca43-9233-40aa-994e-5f3acaabf2ba" containerName="main" probeResult="failure" output="Get \"https://10.133.0.35:8000/health\": dial tcp 10.133.0.35:8000: connect: connection refused" Apr 16 16:42:51.604194 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:51.604163 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hschh_652350aa-d2fc-4c32-bc1b-e593db927908/ovn-acl-logging/0.log" Apr 16 16:42:51.604771 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:51.604347 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hschh_652350aa-d2fc-4c32-bc1b-e593db927908/ovn-acl-logging/0.log" Apr 16 16:42:56.731220 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:56.731160 2577 prober.go:120] 
"Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" podUID="82aadc57-1271-4666-b8fc-078cd616fef7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused" Apr 16 16:42:57.299539 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.299510 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-786fb6d9fb-f28dj_28faf7fa-6d32-4452-a411-1a6061173dae/main/0.log" Apr 16 16:42:57.299890 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.299867 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj" Apr 16 16:42:57.348899 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.348869 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/28faf7fa-6d32-4452-a411-1a6061173dae-kserve-provision-location\") pod \"28faf7fa-6d32-4452-a411-1a6061173dae\" (UID: \"28faf7fa-6d32-4452-a411-1a6061173dae\") " Apr 16 16:42:57.349070 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.348908 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/28faf7fa-6d32-4452-a411-1a6061173dae-tls-certs\") pod \"28faf7fa-6d32-4452-a411-1a6061173dae\" (UID: \"28faf7fa-6d32-4452-a411-1a6061173dae\") " Apr 16 16:42:57.349070 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.348955 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/28faf7fa-6d32-4452-a411-1a6061173dae-dshm\") pod \"28faf7fa-6d32-4452-a411-1a6061173dae\" (UID: \"28faf7fa-6d32-4452-a411-1a6061173dae\") " Apr 16 16:42:57.349236 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.349112 2577 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/28faf7fa-6d32-4452-a411-1a6061173dae-home\") pod \"28faf7fa-6d32-4452-a411-1a6061173dae\" (UID: \"28faf7fa-6d32-4452-a411-1a6061173dae\") " Apr 16 16:42:57.349236 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.349158 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/28faf7fa-6d32-4452-a411-1a6061173dae-model-cache\") pod \"28faf7fa-6d32-4452-a411-1a6061173dae\" (UID: \"28faf7fa-6d32-4452-a411-1a6061173dae\") " Apr 16 16:42:57.349236 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.349195 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh8tw\" (UniqueName: \"kubernetes.io/projected/28faf7fa-6d32-4452-a411-1a6061173dae-kube-api-access-nh8tw\") pod \"28faf7fa-6d32-4452-a411-1a6061173dae\" (UID: \"28faf7fa-6d32-4452-a411-1a6061173dae\") " Apr 16 16:42:57.349648 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.349536 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28faf7fa-6d32-4452-a411-1a6061173dae-home" (OuterVolumeSpecName: "home") pod "28faf7fa-6d32-4452-a411-1a6061173dae" (UID: "28faf7fa-6d32-4452-a411-1a6061173dae"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:42:57.349648 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.349607 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28faf7fa-6d32-4452-a411-1a6061173dae-model-cache" (OuterVolumeSpecName: "model-cache") pod "28faf7fa-6d32-4452-a411-1a6061173dae" (UID: "28faf7fa-6d32-4452-a411-1a6061173dae"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:42:57.351191 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.351171 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28faf7fa-6d32-4452-a411-1a6061173dae-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "28faf7fa-6d32-4452-a411-1a6061173dae" (UID: "28faf7fa-6d32-4452-a411-1a6061173dae"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:42:57.352027 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.352009 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28faf7fa-6d32-4452-a411-1a6061173dae-dshm" (OuterVolumeSpecName: "dshm") pod "28faf7fa-6d32-4452-a411-1a6061173dae" (UID: "28faf7fa-6d32-4452-a411-1a6061173dae"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:42:57.352118 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.352092 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28faf7fa-6d32-4452-a411-1a6061173dae-kube-api-access-nh8tw" (OuterVolumeSpecName: "kube-api-access-nh8tw") pod "28faf7fa-6d32-4452-a411-1a6061173dae" (UID: "28faf7fa-6d32-4452-a411-1a6061173dae"). InnerVolumeSpecName "kube-api-access-nh8tw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:42:57.414565 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.414522 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28faf7fa-6d32-4452-a411-1a6061173dae-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "28faf7fa-6d32-4452-a411-1a6061173dae" (UID: "28faf7fa-6d32-4452-a411-1a6061173dae"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:42:57.450096 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.450064 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/28faf7fa-6d32-4452-a411-1a6061173dae-kserve-provision-location\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:42:57.450236 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.450103 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/28faf7fa-6d32-4452-a411-1a6061173dae-tls-certs\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:42:57.450236 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.450118 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/28faf7fa-6d32-4452-a411-1a6061173dae-dshm\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:42:57.450236 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.450131 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/28faf7fa-6d32-4452-a411-1a6061173dae-home\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:42:57.450236 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.450146 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/28faf7fa-6d32-4452-a411-1a6061173dae-model-cache\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:42:57.450236 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.450159 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nh8tw\" (UniqueName: \"kubernetes.io/projected/28faf7fa-6d32-4452-a411-1a6061173dae-kube-api-access-nh8tw\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:42:57.777129 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.777087 2577 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-786fb6d9fb-f28dj_28faf7fa-6d32-4452-a411-1a6061173dae/main/0.log" Apr 16 16:42:57.777577 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.777552 2577 generic.go:358] "Generic (PLEG): container finished" podID="28faf7fa-6d32-4452-a411-1a6061173dae" containerID="c89e2d35d84de315ac6656170404c7ae5c781d6185db87fcd007c03d71c5dfdc" exitCode=137 Apr 16 16:42:57.777644 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.777631 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj" Apr 16 16:42:57.777683 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.777640 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj" event={"ID":"28faf7fa-6d32-4452-a411-1a6061173dae","Type":"ContainerDied","Data":"c89e2d35d84de315ac6656170404c7ae5c781d6185db87fcd007c03d71c5dfdc"} Apr 16 16:42:57.777736 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.777685 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj" event={"ID":"28faf7fa-6d32-4452-a411-1a6061173dae","Type":"ContainerDied","Data":"50b18c5b092a0c8ec15b94c5a3f755da703b16b7b9d2bb2b554f8c58c27aeff8"} Apr 16 16:42:57.777736 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.777709 2577 scope.go:117] "RemoveContainer" containerID="c89e2d35d84de315ac6656170404c7ae5c781d6185db87fcd007c03d71c5dfdc" Apr 16 16:42:57.801530 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.801496 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj"] Apr 16 16:42:57.803393 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.803367 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-786fb6d9fb-f28dj"] Apr 16 16:42:57.805238 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.805219 2577 scope.go:117] "RemoveContainer" containerID="606d9e2a526de58966727e2a5fdd644fa9b0cbf3a9c8b1154dfb5cde7e2c59f5" Apr 16 16:42:57.870145 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.870114 2577 scope.go:117] "RemoveContainer" containerID="c89e2d35d84de315ac6656170404c7ae5c781d6185db87fcd007c03d71c5dfdc" Apr 16 16:42:57.870678 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:42:57.870651 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c89e2d35d84de315ac6656170404c7ae5c781d6185db87fcd007c03d71c5dfdc\": container with ID starting with c89e2d35d84de315ac6656170404c7ae5c781d6185db87fcd007c03d71c5dfdc not found: ID does not exist" containerID="c89e2d35d84de315ac6656170404c7ae5c781d6185db87fcd007c03d71c5dfdc" Apr 16 16:42:57.870796 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.870692 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c89e2d35d84de315ac6656170404c7ae5c781d6185db87fcd007c03d71c5dfdc"} err="failed to get container status \"c89e2d35d84de315ac6656170404c7ae5c781d6185db87fcd007c03d71c5dfdc\": rpc error: code = NotFound desc = could not find container \"c89e2d35d84de315ac6656170404c7ae5c781d6185db87fcd007c03d71c5dfdc\": container with ID starting with c89e2d35d84de315ac6656170404c7ae5c781d6185db87fcd007c03d71c5dfdc not found: ID does not exist" Apr 16 16:42:57.870796 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.870720 2577 scope.go:117] "RemoveContainer" containerID="606d9e2a526de58966727e2a5fdd644fa9b0cbf3a9c8b1154dfb5cde7e2c59f5" Apr 16 16:42:57.871086 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:42:57.871055 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"606d9e2a526de58966727e2a5fdd644fa9b0cbf3a9c8b1154dfb5cde7e2c59f5\": container with ID starting with 606d9e2a526de58966727e2a5fdd644fa9b0cbf3a9c8b1154dfb5cde7e2c59f5 not found: ID does not exist" containerID="606d9e2a526de58966727e2a5fdd644fa9b0cbf3a9c8b1154dfb5cde7e2c59f5" Apr 16 16:42:57.871191 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:57.871094 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"606d9e2a526de58966727e2a5fdd644fa9b0cbf3a9c8b1154dfb5cde7e2c59f5"} err="failed to get container status \"606d9e2a526de58966727e2a5fdd644fa9b0cbf3a9c8b1154dfb5cde7e2c59f5\": rpc error: code = NotFound desc = could not find container \"606d9e2a526de58966727e2a5fdd644fa9b0cbf3a9c8b1154dfb5cde7e2c59f5\": container with ID starting with 606d9e2a526de58966727e2a5fdd644fa9b0cbf3a9c8b1154dfb5cde7e2c59f5 not found: ID does not exist" Apr 16 16:42:59.106192 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:59.106141 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" podUID="6011ca43-9233-40aa-994e-5f3acaabf2ba" containerName="main" probeResult="failure" output="Get \"https://10.133.0.35:8000/health\": dial tcp 10.133.0.35:8000: connect: connection refused" Apr 16 16:42:59.653253 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:42:59.653211 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28faf7fa-6d32-4452-a411-1a6061173dae" path="/var/lib/kubelet/pods/28faf7fa-6d32-4452-a411-1a6061173dae/volumes" Apr 16 16:43:06.730910 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:43:06.730858 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" podUID="82aadc57-1271-4666-b8fc-078cd616fef7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused" Apr 16 16:43:09.106206 ip-10-0-130-165 
kubenswrapper[2577]: I0416 16:43:09.106156 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" podUID="6011ca43-9233-40aa-994e-5f3acaabf2ba" containerName="main" probeResult="failure" output="Get \"https://10.133.0.35:8000/health\": dial tcp 10.133.0.35:8000: connect: connection refused" Apr 16 16:43:16.731375 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:43:16.731311 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" podUID="82aadc57-1271-4666-b8fc-078cd616fef7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused" Apr 16 16:43:19.105950 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:43:19.105895 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" podUID="6011ca43-9233-40aa-994e-5f3acaabf2ba" containerName="main" probeResult="failure" output="Get \"https://10.133.0.35:8000/health\": dial tcp 10.133.0.35:8000: connect: connection refused" Apr 16 16:43:26.731097 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:43:26.731055 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" podUID="82aadc57-1271-4666-b8fc-078cd616fef7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused" Apr 16 16:43:29.105833 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:43:29.105784 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" podUID="6011ca43-9233-40aa-994e-5f3acaabf2ba" containerName="main" probeResult="failure" output="Get \"https://10.133.0.35:8000/health\": dial tcp 10.133.0.35:8000: connect: connection refused" Apr 16 16:43:36.731014 
ip-10-0-130-165 kubenswrapper[2577]: I0416 16:43:36.730966 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" podUID="82aadc57-1271-4666-b8fc-078cd616fef7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused" Apr 16 16:43:39.106360 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:43:39.106248 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" podUID="6011ca43-9233-40aa-994e-5f3acaabf2ba" containerName="main" probeResult="failure" output="Get \"https://10.133.0.35:8000/health\": dial tcp 10.133.0.35:8000: connect: connection refused" Apr 16 16:43:46.740708 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:43:46.740668 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" Apr 16 16:43:46.748970 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:43:46.748933 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" Apr 16 16:43:47.929766 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:43:47.929718 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd"] Apr 16 16:43:47.959864 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:43:47.959794 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" podUID="82aadc57-1271-4666-b8fc-078cd616fef7" containerName="main" containerID="cri-o://cf9f024ff6f2b16fbb1e590069067a6a80c18583da1ae0ec1463ea7890707b3f" gracePeriod=30 Apr 16 16:43:49.105815 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:43:49.105775 2577 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" podUID="6011ca43-9233-40aa-994e-5f3acaabf2ba" containerName="main" probeResult="failure" output="Get \"https://10.133.0.35:8000/health\": dial tcp 10.133.0.35:8000: connect: connection refused" Apr 16 16:43:59.106426 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:43:59.106383 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" podUID="6011ca43-9233-40aa-994e-5f3acaabf2ba" containerName="main" probeResult="failure" output="Get \"https://10.133.0.35:8000/health\": dial tcp 10.133.0.35:8000: connect: connection refused" Apr 16 16:44:09.106085 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:09.106026 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" podUID="6011ca43-9233-40aa-994e-5f3acaabf2ba" containerName="main" probeResult="failure" output="Get \"https://10.133.0.35:8000/health\": dial tcp 10.133.0.35:8000: connect: connection refused" Apr 16 16:44:18.246170 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:18.246143 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-5f6c45bb9b-slkzd_82aadc57-1271-4666-b8fc-078cd616fef7/main/0.log" Apr 16 16:44:18.246571 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:18.246556 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" Apr 16 16:44:18.391610 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:18.391580 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/82aadc57-1271-4666-b8fc-078cd616fef7-home\") pod \"82aadc57-1271-4666-b8fc-078cd616fef7\" (UID: \"82aadc57-1271-4666-b8fc-078cd616fef7\") " Apr 16 16:44:18.391610 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:18.391615 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/82aadc57-1271-4666-b8fc-078cd616fef7-dshm\") pod \"82aadc57-1271-4666-b8fc-078cd616fef7\" (UID: \"82aadc57-1271-4666-b8fc-078cd616fef7\") " Apr 16 16:44:18.391852 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:18.391680 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/82aadc57-1271-4666-b8fc-078cd616fef7-tls-certs\") pod \"82aadc57-1271-4666-b8fc-078cd616fef7\" (UID: \"82aadc57-1271-4666-b8fc-078cd616fef7\") " Apr 16 16:44:18.391852 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:18.391702 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/82aadc57-1271-4666-b8fc-078cd616fef7-model-cache\") pod \"82aadc57-1271-4666-b8fc-078cd616fef7\" (UID: \"82aadc57-1271-4666-b8fc-078cd616fef7\") " Apr 16 16:44:18.391852 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:18.391729 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7m4s\" (UniqueName: \"kubernetes.io/projected/82aadc57-1271-4666-b8fc-078cd616fef7-kube-api-access-p7m4s\") pod \"82aadc57-1271-4666-b8fc-078cd616fef7\" (UID: \"82aadc57-1271-4666-b8fc-078cd616fef7\") " Apr 16 16:44:18.391852 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:18.391753 2577 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/82aadc57-1271-4666-b8fc-078cd616fef7-kserve-provision-location\") pod \"82aadc57-1271-4666-b8fc-078cd616fef7\" (UID: \"82aadc57-1271-4666-b8fc-078cd616fef7\") " Apr 16 16:44:18.392066 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:18.392038 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82aadc57-1271-4666-b8fc-078cd616fef7-model-cache" (OuterVolumeSpecName: "model-cache") pod "82aadc57-1271-4666-b8fc-078cd616fef7" (UID: "82aadc57-1271-4666-b8fc-078cd616fef7"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:44:18.392119 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:18.392054 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82aadc57-1271-4666-b8fc-078cd616fef7-home" (OuterVolumeSpecName: "home") pod "82aadc57-1271-4666-b8fc-078cd616fef7" (UID: "82aadc57-1271-4666-b8fc-078cd616fef7"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:44:18.393959 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:18.393908 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82aadc57-1271-4666-b8fc-078cd616fef7-kube-api-access-p7m4s" (OuterVolumeSpecName: "kube-api-access-p7m4s") pod "82aadc57-1271-4666-b8fc-078cd616fef7" (UID: "82aadc57-1271-4666-b8fc-078cd616fef7"). InnerVolumeSpecName "kube-api-access-p7m4s". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:44:18.394322 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:18.394297 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82aadc57-1271-4666-b8fc-078cd616fef7-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "82aadc57-1271-4666-b8fc-078cd616fef7" (UID: "82aadc57-1271-4666-b8fc-078cd616fef7"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:44:18.394408 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:18.394302 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82aadc57-1271-4666-b8fc-078cd616fef7-dshm" (OuterVolumeSpecName: "dshm") pod "82aadc57-1271-4666-b8fc-078cd616fef7" (UID: "82aadc57-1271-4666-b8fc-078cd616fef7"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:44:18.456962 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:18.456914 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82aadc57-1271-4666-b8fc-078cd616fef7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "82aadc57-1271-4666-b8fc-078cd616fef7" (UID: "82aadc57-1271-4666-b8fc-078cd616fef7"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:44:18.493546 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:18.493439 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/82aadc57-1271-4666-b8fc-078cd616fef7-tls-certs\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:44:18.493546 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:18.493491 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/82aadc57-1271-4666-b8fc-078cd616fef7-model-cache\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:44:18.493546 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:18.493501 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p7m4s\" (UniqueName: \"kubernetes.io/projected/82aadc57-1271-4666-b8fc-078cd616fef7-kube-api-access-p7m4s\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:44:18.493546 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:18.493512 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/82aadc57-1271-4666-b8fc-078cd616fef7-kserve-provision-location\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:44:18.493546 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:18.493522 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/82aadc57-1271-4666-b8fc-078cd616fef7-home\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:44:18.493546 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:18.493530 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/82aadc57-1271-4666-b8fc-078cd616fef7-dshm\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:44:19.074267 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:19.074235 2577 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-5f6c45bb9b-slkzd_82aadc57-1271-4666-b8fc-078cd616fef7/main/0.log" Apr 16 16:44:19.074649 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:19.074617 2577 generic.go:358] "Generic (PLEG): container finished" podID="82aadc57-1271-4666-b8fc-078cd616fef7" containerID="cf9f024ff6f2b16fbb1e590069067a6a80c18583da1ae0ec1463ea7890707b3f" exitCode=137 Apr 16 16:44:19.074805 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:19.074661 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" event={"ID":"82aadc57-1271-4666-b8fc-078cd616fef7","Type":"ContainerDied","Data":"cf9f024ff6f2b16fbb1e590069067a6a80c18583da1ae0ec1463ea7890707b3f"} Apr 16 16:44:19.074805 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:19.074683 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" event={"ID":"82aadc57-1271-4666-b8fc-078cd616fef7","Type":"ContainerDied","Data":"2595de567c04af3b503fca8da9abdf17feacc626f034fae62dd5d45fe8686618"} Apr 16 16:44:19.074805 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:19.074697 2577 scope.go:117] "RemoveContainer" containerID="cf9f024ff6f2b16fbb1e590069067a6a80c18583da1ae0ec1463ea7890707b3f" Apr 16 16:44:19.074805 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:19.074717 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd" Apr 16 16:44:19.095235 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:19.095156 2577 scope.go:117] "RemoveContainer" containerID="8611e50cdc54b1dcbbfe818ce4e4490f0ca7f37c4e9ac94a25facf618f86132f" Apr 16 16:44:19.100291 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:19.098466 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd"] Apr 16 16:44:19.105761 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:19.105721 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" podUID="6011ca43-9233-40aa-994e-5f3acaabf2ba" containerName="main" probeResult="failure" output="Get \"https://10.133.0.35:8000/health\": dial tcp 10.133.0.35:8000: connect: connection refused" Apr 16 16:44:19.107165 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:19.107140 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5f6c45bb9b-slkzd"] Apr 16 16:44:19.187533 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:19.187510 2577 scope.go:117] "RemoveContainer" containerID="cf9f024ff6f2b16fbb1e590069067a6a80c18583da1ae0ec1463ea7890707b3f" Apr 16 16:44:19.187899 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:44:19.187873 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf9f024ff6f2b16fbb1e590069067a6a80c18583da1ae0ec1463ea7890707b3f\": container with ID starting with cf9f024ff6f2b16fbb1e590069067a6a80c18583da1ae0ec1463ea7890707b3f not found: ID does not exist" containerID="cf9f024ff6f2b16fbb1e590069067a6a80c18583da1ae0ec1463ea7890707b3f" Apr 16 16:44:19.187957 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:19.187910 2577 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cf9f024ff6f2b16fbb1e590069067a6a80c18583da1ae0ec1463ea7890707b3f"} err="failed to get container status \"cf9f024ff6f2b16fbb1e590069067a6a80c18583da1ae0ec1463ea7890707b3f\": rpc error: code = NotFound desc = could not find container \"cf9f024ff6f2b16fbb1e590069067a6a80c18583da1ae0ec1463ea7890707b3f\": container with ID starting with cf9f024ff6f2b16fbb1e590069067a6a80c18583da1ae0ec1463ea7890707b3f not found: ID does not exist" Apr 16 16:44:19.187957 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:19.187930 2577 scope.go:117] "RemoveContainer" containerID="8611e50cdc54b1dcbbfe818ce4e4490f0ca7f37c4e9ac94a25facf618f86132f" Apr 16 16:44:19.188251 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:44:19.188226 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8611e50cdc54b1dcbbfe818ce4e4490f0ca7f37c4e9ac94a25facf618f86132f\": container with ID starting with 8611e50cdc54b1dcbbfe818ce4e4490f0ca7f37c4e9ac94a25facf618f86132f not found: ID does not exist" containerID="8611e50cdc54b1dcbbfe818ce4e4490f0ca7f37c4e9ac94a25facf618f86132f" Apr 16 16:44:19.188330 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:19.188260 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8611e50cdc54b1dcbbfe818ce4e4490f0ca7f37c4e9ac94a25facf618f86132f"} err="failed to get container status \"8611e50cdc54b1dcbbfe818ce4e4490f0ca7f37c4e9ac94a25facf618f86132f\": rpc error: code = NotFound desc = could not find container \"8611e50cdc54b1dcbbfe818ce4e4490f0ca7f37c4e9ac94a25facf618f86132f\": container with ID starting with 8611e50cdc54b1dcbbfe818ce4e4490f0ca7f37c4e9ac94a25facf618f86132f not found: ID does not exist" Apr 16 16:44:19.654047 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:19.654007 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82aadc57-1271-4666-b8fc-078cd616fef7" 
path="/var/lib/kubelet/pods/82aadc57-1271-4666-b8fc-078cd616fef7/volumes" Apr 16 16:44:29.116164 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:29.116130 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" Apr 16 16:44:29.124399 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:29.124371 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" Apr 16 16:44:47.617312 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:47.617276 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj"] Apr 16 16:44:47.619865 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:47.617674 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" podUID="6011ca43-9233-40aa-994e-5f3acaabf2ba" containerName="main" containerID="cri-o://2e977664529d56dbd076040274b4694034e15c26deadad500e069ab525011360" gracePeriod=30 Apr 16 16:44:53.010688 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.010655 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg"] Apr 16 16:44:53.011208 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.011048 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="82aadc57-1271-4666-b8fc-078cd616fef7" containerName="storage-initializer" Apr 16 16:44:53.011208 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.011062 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="82aadc57-1271-4666-b8fc-078cd616fef7" containerName="storage-initializer" Apr 16 16:44:53.011208 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.011078 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="28faf7fa-6d32-4452-a411-1a6061173dae" containerName="main" Apr 16 16:44:53.011208 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.011086 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="28faf7fa-6d32-4452-a411-1a6061173dae" containerName="main" Apr 16 16:44:53.011208 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.011098 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="82aadc57-1271-4666-b8fc-078cd616fef7" containerName="main" Apr 16 16:44:53.011208 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.011107 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="82aadc57-1271-4666-b8fc-078cd616fef7" containerName="main" Apr 16 16:44:53.011208 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.011118 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28faf7fa-6d32-4452-a411-1a6061173dae" containerName="storage-initializer" Apr 16 16:44:53.011208 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.011124 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="28faf7fa-6d32-4452-a411-1a6061173dae" containerName="storage-initializer" Apr 16 16:44:53.011208 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.011186 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="82aadc57-1271-4666-b8fc-078cd616fef7" containerName="main" Apr 16 16:44:53.011208 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.011197 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="28faf7fa-6d32-4452-a411-1a6061173dae" containerName="main" Apr 16 16:44:53.014921 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.014903 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" Apr 16 16:44:53.018987 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.018959 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-dockercfg-6wd9g\"" Apr 16 16:44:53.019100 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.019016 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 16 16:44:53.043683 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.043654 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg"] Apr 16 16:44:53.064302 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.064265 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk"] Apr 16 16:44:53.067745 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.067724 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" Apr 16 16:44:53.093564 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.093535 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk"] Apr 16 16:44:53.115375 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.115341 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/239f5ac8-ab81-4824-9309-d7950f9dc58f-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg\" (UID: \"239f5ac8-ab81-4824-9309-d7950f9dc58f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" Apr 16 16:44:53.115589 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.115395 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/239f5ac8-ab81-4824-9309-d7950f9dc58f-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg\" (UID: \"239f5ac8-ab81-4824-9309-d7950f9dc58f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" Apr 16 16:44:53.115589 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.115486 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/239f5ac8-ab81-4824-9309-d7950f9dc58f-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg\" (UID: \"239f5ac8-ab81-4824-9309-d7950f9dc58f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" Apr 16 16:44:53.115589 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.115544 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kkzc\" (UniqueName: \"kubernetes.io/projected/239f5ac8-ab81-4824-9309-d7950f9dc58f-kube-api-access-2kkzc\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg\" (UID: \"239f5ac8-ab81-4824-9309-d7950f9dc58f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" Apr 16 16:44:53.115589 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.115575 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/239f5ac8-ab81-4824-9309-d7950f9dc58f-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg\" (UID: \"239f5ac8-ab81-4824-9309-d7950f9dc58f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" Apr 16 16:44:53.115758 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.115598 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/239f5ac8-ab81-4824-9309-d7950f9dc58f-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg\" (UID: \"239f5ac8-ab81-4824-9309-d7950f9dc58f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" Apr 16 16:44:53.216296 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.216258 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5485249-290d-42c8-b274-6111f1454a7f-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk\" (UID: \"e5485249-290d-42c8-b274-6111f1454a7f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" Apr 16 16:44:53.216296 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.216309 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/239f5ac8-ab81-4824-9309-d7950f9dc58f-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg\" (UID: \"239f5ac8-ab81-4824-9309-d7950f9dc58f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" Apr 16 16:44:53.216606 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.216332 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5485249-290d-42c8-b274-6111f1454a7f-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk\" (UID: \"e5485249-290d-42c8-b274-6111f1454a7f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" Apr 16 16:44:53.216606 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.216358 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5485249-290d-42c8-b274-6111f1454a7f-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk\" (UID: \"e5485249-290d-42c8-b274-6111f1454a7f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" Apr 16 16:44:53.216606 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.216386 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2kkzc\" (UniqueName: \"kubernetes.io/projected/239f5ac8-ab81-4824-9309-d7950f9dc58f-kube-api-access-2kkzc\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg\" (UID: \"239f5ac8-ab81-4824-9309-d7950f9dc58f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" Apr 16 16:44:53.216606 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.216408 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/239f5ac8-ab81-4824-9309-d7950f9dc58f-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg\" (UID: \"239f5ac8-ab81-4824-9309-d7950f9dc58f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" Apr 16 16:44:53.216606 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.216527 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/239f5ac8-ab81-4824-9309-d7950f9dc58f-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg\" (UID: \"239f5ac8-ab81-4824-9309-d7950f9dc58f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" Apr 16 16:44:53.216872 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.216655 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e5485249-290d-42c8-b274-6111f1454a7f-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk\" (UID: \"e5485249-290d-42c8-b274-6111f1454a7f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" Apr 16 16:44:53.216872 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.216712 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b27b\" (UniqueName: \"kubernetes.io/projected/e5485249-290d-42c8-b274-6111f1454a7f-kube-api-access-7b27b\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk\" (UID: \"e5485249-290d-42c8-b274-6111f1454a7f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" Apr 16 16:44:53.216872 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.216758 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e5485249-290d-42c8-b274-6111f1454a7f-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk\" (UID: \"e5485249-290d-42c8-b274-6111f1454a7f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" Apr 16 16:44:53.216872 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.216767 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/239f5ac8-ab81-4824-9309-d7950f9dc58f-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg\" (UID: \"239f5ac8-ab81-4824-9309-d7950f9dc58f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" Apr 16 16:44:53.216872 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.216760 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/239f5ac8-ab81-4824-9309-d7950f9dc58f-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg\" (UID: \"239f5ac8-ab81-4824-9309-d7950f9dc58f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" Apr 16 16:44:53.216872 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.216829 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/239f5ac8-ab81-4824-9309-d7950f9dc58f-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg\" (UID: \"239f5ac8-ab81-4824-9309-d7950f9dc58f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" Apr 16 16:44:53.217164 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.216886 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" 
(UniqueName: \"kubernetes.io/secret/239f5ac8-ab81-4824-9309-d7950f9dc58f-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg\" (UID: \"239f5ac8-ab81-4824-9309-d7950f9dc58f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" Apr 16 16:44:53.217164 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.217117 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/239f5ac8-ab81-4824-9309-d7950f9dc58f-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg\" (UID: \"239f5ac8-ab81-4824-9309-d7950f9dc58f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" Apr 16 16:44:53.218840 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.218823 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/239f5ac8-ab81-4824-9309-d7950f9dc58f-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg\" (UID: \"239f5ac8-ab81-4824-9309-d7950f9dc58f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" Apr 16 16:44:53.219264 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.219244 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/239f5ac8-ab81-4824-9309-d7950f9dc58f-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg\" (UID: \"239f5ac8-ab81-4824-9309-d7950f9dc58f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" Apr 16 16:44:53.227974 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.227953 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kkzc\" (UniqueName: 
\"kubernetes.io/projected/239f5ac8-ab81-4824-9309-d7950f9dc58f-kube-api-access-2kkzc\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg\" (UID: \"239f5ac8-ab81-4824-9309-d7950f9dc58f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" Apr 16 16:44:53.318202 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.318100 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e5485249-290d-42c8-b274-6111f1454a7f-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk\" (UID: \"e5485249-290d-42c8-b274-6111f1454a7f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" Apr 16 16:44:53.318202 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.318154 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7b27b\" (UniqueName: \"kubernetes.io/projected/e5485249-290d-42c8-b274-6111f1454a7f-kube-api-access-7b27b\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk\" (UID: \"e5485249-290d-42c8-b274-6111f1454a7f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" Apr 16 16:44:53.318202 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.318182 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e5485249-290d-42c8-b274-6111f1454a7f-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk\" (UID: \"e5485249-290d-42c8-b274-6111f1454a7f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" Apr 16 16:44:53.318521 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.318228 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/e5485249-290d-42c8-b274-6111f1454a7f-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk\" (UID: \"e5485249-290d-42c8-b274-6111f1454a7f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" Apr 16 16:44:53.318521 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.318255 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5485249-290d-42c8-b274-6111f1454a7f-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk\" (UID: \"e5485249-290d-42c8-b274-6111f1454a7f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" Apr 16 16:44:53.318521 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.318281 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5485249-290d-42c8-b274-6111f1454a7f-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk\" (UID: \"e5485249-290d-42c8-b274-6111f1454a7f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" Apr 16 16:44:53.318709 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.318689 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e5485249-290d-42c8-b274-6111f1454a7f-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk\" (UID: \"e5485249-290d-42c8-b274-6111f1454a7f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" Apr 16 16:44:53.318767 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.318743 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5485249-290d-42c8-b274-6111f1454a7f-model-cache\") pod 
\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk\" (UID: \"e5485249-290d-42c8-b274-6111f1454a7f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" Apr 16 16:44:53.318817 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.318755 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5485249-290d-42c8-b274-6111f1454a7f-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk\" (UID: \"e5485249-290d-42c8-b274-6111f1454a7f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" Apr 16 16:44:53.320430 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.320408 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e5485249-290d-42c8-b274-6111f1454a7f-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk\" (UID: \"e5485249-290d-42c8-b274-6111f1454a7f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" Apr 16 16:44:53.320777 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.320758 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5485249-290d-42c8-b274-6111f1454a7f-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk\" (UID: \"e5485249-290d-42c8-b274-6111f1454a7f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" Apr 16 16:44:53.324649 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.324613 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" Apr 16 16:44:53.331330 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.331271 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b27b\" (UniqueName: \"kubernetes.io/projected/e5485249-290d-42c8-b274-6111f1454a7f-kube-api-access-7b27b\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk\" (UID: \"e5485249-290d-42c8-b274-6111f1454a7f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" Apr 16 16:44:53.377814 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.377761 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" Apr 16 16:44:53.481736 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.481707 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg"] Apr 16 16:44:53.483342 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:44:53.483310 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod239f5ac8_ab81_4824_9309_d7950f9dc58f.slice/crio-8bf1b952707893146c85140fefb61a1efa73067b151063d5096b16223c34571c WatchSource:0}: Error finding container 8bf1b952707893146c85140fefb61a1efa73067b151063d5096b16223c34571c: Status 404 returned error can't find the container with id 8bf1b952707893146c85140fefb61a1efa73067b151063d5096b16223c34571c Apr 16 16:44:53.525715 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:53.525675 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk"] Apr 16 16:44:53.529467 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:44:53.529413 2577 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5485249_290d_42c8_b274_6111f1454a7f.slice/crio-0d52d299d4a4b70d7995dfdbb73a43288d7fd13c08472bb110684e42eaf758c7 WatchSource:0}: Error finding container 0d52d299d4a4b70d7995dfdbb73a43288d7fd13c08472bb110684e42eaf758c7: Status 404 returned error can't find the container with id 0d52d299d4a4b70d7995dfdbb73a43288d7fd13c08472bb110684e42eaf758c7 Apr 16 16:44:54.201820 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:54.201774 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" event={"ID":"e5485249-290d-42c8-b274-6111f1454a7f","Type":"ContainerStarted","Data":"58fe4d30b28fbbbc689bf6b3d54c4a06861988b8251fd6dfb98a14137e8bbfdb"} Apr 16 16:44:54.201820 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:54.201825 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" event={"ID":"e5485249-290d-42c8-b274-6111f1454a7f","Type":"ContainerStarted","Data":"0d52d299d4a4b70d7995dfdbb73a43288d7fd13c08472bb110684e42eaf758c7"} Apr 16 16:44:54.203661 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:54.203626 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" event={"ID":"239f5ac8-ab81-4824-9309-d7950f9dc58f","Type":"ContainerStarted","Data":"8bf1b952707893146c85140fefb61a1efa73067b151063d5096b16223c34571c"} Apr 16 16:44:55.209323 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:55.209272 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" event={"ID":"239f5ac8-ab81-4824-9309-d7950f9dc58f","Type":"ContainerStarted","Data":"451c9c7035a7f3c1ccc9386bda61ad0f325a351d0ba6ba21b6dbf6f8aa4e73c9"} Apr 16 16:44:56.219385 ip-10-0-130-165 kubenswrapper[2577]: 
I0416 16:44:56.219346 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" event={"ID":"239f5ac8-ab81-4824-9309-d7950f9dc58f","Type":"ContainerStarted","Data":"a0f0f663e07b57e100a2e3b31b3fbd308f2f23f8adbcf2080947f5a460c5e569"} Apr 16 16:44:56.219849 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:56.219636 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" Apr 16 16:44:57.471219 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:57.471177 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k"] Apr 16 16:44:57.478757 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:57.478719 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" Apr 16 16:44:57.482095 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:57.482023 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8de1d74aab16d9cabd8b5aafeb5248e8-kserve-self-signed-certs\"" Apr 16 16:44:57.486938 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:57.486904 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k"] Apr 16 16:44:57.672299 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:57.672253 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1b56fe14-94af-4d7e-8541-7468f7349e1e-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k\" (UID: \"1b56fe14-94af-4d7e-8541-7468f7349e1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" 
Apr 16 16:44:57.672513 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:57.672311 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b56fe14-94af-4d7e-8541-7468f7349e1e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k\" (UID: \"1b56fe14-94af-4d7e-8541-7468f7349e1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" Apr 16 16:44:57.672513 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:57.672359 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1b56fe14-94af-4d7e-8541-7468f7349e1e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k\" (UID: \"1b56fe14-94af-4d7e-8541-7468f7349e1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" Apr 16 16:44:57.672513 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:57.672395 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smbgf\" (UniqueName: \"kubernetes.io/projected/1b56fe14-94af-4d7e-8541-7468f7349e1e-kube-api-access-smbgf\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k\" (UID: \"1b56fe14-94af-4d7e-8541-7468f7349e1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" Apr 16 16:44:57.672513 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:57.672484 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1b56fe14-94af-4d7e-8541-7468f7349e1e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k\" (UID: \"1b56fe14-94af-4d7e-8541-7468f7349e1e\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" Apr 16 16:44:57.672722 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:57.672547 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1b56fe14-94af-4d7e-8541-7468f7349e1e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k\" (UID: \"1b56fe14-94af-4d7e-8541-7468f7349e1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" Apr 16 16:44:57.773745 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:57.773653 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1b56fe14-94af-4d7e-8541-7468f7349e1e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k\" (UID: \"1b56fe14-94af-4d7e-8541-7468f7349e1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" Apr 16 16:44:57.773745 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:57.773708 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-smbgf\" (UniqueName: \"kubernetes.io/projected/1b56fe14-94af-4d7e-8541-7468f7349e1e-kube-api-access-smbgf\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k\" (UID: \"1b56fe14-94af-4d7e-8541-7468f7349e1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" Apr 16 16:44:57.773745 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:57.773737 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1b56fe14-94af-4d7e-8541-7468f7349e1e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k\" (UID: \"1b56fe14-94af-4d7e-8541-7468f7349e1e\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" Apr 16 16:44:57.774004 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:57.773783 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1b56fe14-94af-4d7e-8541-7468f7349e1e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k\" (UID: \"1b56fe14-94af-4d7e-8541-7468f7349e1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" Apr 16 16:44:57.774004 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:57.773862 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1b56fe14-94af-4d7e-8541-7468f7349e1e-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k\" (UID: \"1b56fe14-94af-4d7e-8541-7468f7349e1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" Apr 16 16:44:57.774004 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:57.773908 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b56fe14-94af-4d7e-8541-7468f7349e1e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k\" (UID: \"1b56fe14-94af-4d7e-8541-7468f7349e1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" Apr 16 16:44:57.774436 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:57.774364 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1b56fe14-94af-4d7e-8541-7468f7349e1e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k\" (UID: \"1b56fe14-94af-4d7e-8541-7468f7349e1e\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" Apr 16 16:44:57.774436 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:57.774387 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b56fe14-94af-4d7e-8541-7468f7349e1e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k\" (UID: \"1b56fe14-94af-4d7e-8541-7468f7349e1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" Apr 16 16:44:57.774927 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:57.774894 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1b56fe14-94af-4d7e-8541-7468f7349e1e-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k\" (UID: \"1b56fe14-94af-4d7e-8541-7468f7349e1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" Apr 16 16:44:57.777536 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:57.777502 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1b56fe14-94af-4d7e-8541-7468f7349e1e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k\" (UID: \"1b56fe14-94af-4d7e-8541-7468f7349e1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" Apr 16 16:44:57.777867 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:57.777820 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1b56fe14-94af-4d7e-8541-7468f7349e1e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k\" (UID: \"1b56fe14-94af-4d7e-8541-7468f7349e1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" Apr 16 16:44:57.786156 
ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:57.786118 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-smbgf\" (UniqueName: \"kubernetes.io/projected/1b56fe14-94af-4d7e-8541-7468f7349e1e-kube-api-access-smbgf\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k\" (UID: \"1b56fe14-94af-4d7e-8541-7468f7349e1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" Apr 16 16:44:57.798553 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:57.798500 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" Apr 16 16:44:57.974192 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:57.973394 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k"] Apr 16 16:44:58.228556 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:58.228505 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" event={"ID":"1b56fe14-94af-4d7e-8541-7468f7349e1e","Type":"ContainerStarted","Data":"7072181b8d55b2216142a4365dc20d1cf9967bee4e5352261a7878d4b29ab805"} Apr 16 16:44:58.228556 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:58.228550 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" event={"ID":"1b56fe14-94af-4d7e-8541-7468f7349e1e","Type":"ContainerStarted","Data":"c95af642d75fa9101ef77249d2c7adba61e59ae2f1748f5a3a1388128e08784f"} Apr 16 16:44:59.234474 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:59.234371 2577 generic.go:358] "Generic (PLEG): container finished" podID="e5485249-290d-42c8-b274-6111f1454a7f" containerID="58fe4d30b28fbbbc689bf6b3d54c4a06861988b8251fd6dfb98a14137e8bbfdb" exitCode=0 Apr 16 16:44:59.235154 
ip-10-0-130-165 kubenswrapper[2577]: I0416 16:44:59.234391 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" event={"ID":"e5485249-290d-42c8-b274-6111f1454a7f","Type":"ContainerDied","Data":"58fe4d30b28fbbbc689bf6b3d54c4a06861988b8251fd6dfb98a14137e8bbfdb"} Apr 16 16:45:00.241631 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:00.241589 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" event={"ID":"e5485249-290d-42c8-b274-6111f1454a7f","Type":"ContainerStarted","Data":"bd79469e73ea608c2601b2ac996d4052235075d3f6834a701377ddbc4867a5e1"} Apr 16 16:45:00.243415 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:00.243381 2577 generic.go:358] "Generic (PLEG): container finished" podID="239f5ac8-ab81-4824-9309-d7950f9dc58f" containerID="a0f0f663e07b57e100a2e3b31b3fbd308f2f23f8adbcf2080947f5a460c5e569" exitCode=0 Apr 16 16:45:00.243573 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:00.243467 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" event={"ID":"239f5ac8-ab81-4824-9309-d7950f9dc58f","Type":"ContainerDied","Data":"a0f0f663e07b57e100a2e3b31b3fbd308f2f23f8adbcf2080947f5a460c5e569"} Apr 16 16:45:00.264049 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:00.263992 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" podStartSLOduration=7.26397534 podStartE2EDuration="7.26397534s" podCreationTimestamp="2026-04-16 16:44:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:45:00.261228362 +0000 UTC m=+1329.226008853" watchObservedRunningTime="2026-04-16 16:45:00.26397534 +0000 UTC 
m=+1329.228755883" Apr 16 16:45:01.250518 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:01.250473 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" event={"ID":"239f5ac8-ab81-4824-9309-d7950f9dc58f","Type":"ContainerStarted","Data":"f0021c97ca58721ca2bd813f64ebb0acb7de38b3e3488a5e458b5d5cbbec99f4"} Apr 16 16:45:01.275609 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:01.275542 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" podStartSLOduration=7.960931749 podStartE2EDuration="9.275522642s" podCreationTimestamp="2026-04-16 16:44:52 +0000 UTC" firstStartedPulling="2026-04-16 16:44:53.4854781 +0000 UTC m=+1322.450258578" lastFinishedPulling="2026-04-16 16:44:54.800068992 +0000 UTC m=+1323.764849471" observedRunningTime="2026-04-16 16:45:01.274632262 +0000 UTC m=+1330.239412782" watchObservedRunningTime="2026-04-16 16:45:01.275522642 +0000 UTC m=+1330.240303141" Apr 16 16:45:02.255346 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:02.255314 2577 generic.go:358] "Generic (PLEG): container finished" podID="1b56fe14-94af-4d7e-8541-7468f7349e1e" containerID="7072181b8d55b2216142a4365dc20d1cf9967bee4e5352261a7878d4b29ab805" exitCode=0 Apr 16 16:45:02.255825 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:02.255360 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" event={"ID":"1b56fe14-94af-4d7e-8541-7468f7349e1e","Type":"ContainerDied","Data":"7072181b8d55b2216142a4365dc20d1cf9967bee4e5352261a7878d4b29ab805"} Apr 16 16:45:03.261086 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:03.261047 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" 
event={"ID":"1b56fe14-94af-4d7e-8541-7468f7349e1e","Type":"ContainerStarted","Data":"6ca39f905b82430527e30481b8bc15f300724759904a1645144323272fb810aa"} Apr 16 16:45:03.285674 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:03.285618 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" podStartSLOduration=6.285600325 podStartE2EDuration="6.285600325s" podCreationTimestamp="2026-04-16 16:44:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:45:03.282165416 +0000 UTC m=+1332.246945930" watchObservedRunningTime="2026-04-16 16:45:03.285600325 +0000 UTC m=+1332.250380822" Apr 16 16:45:03.325191 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:03.325159 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" Apr 16 16:45:03.325191 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:03.325204 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" Apr 16 16:45:03.326824 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:03.326791 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" podUID="239f5ac8-ab81-4824-9309-d7950f9dc58f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 16:45:03.378537 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:03.378497 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" Apr 16 16:45:03.378537 ip-10-0-130-165 
kubenswrapper[2577]: I0416 16:45:03.378551 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" Apr 16 16:45:03.380355 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:03.380315 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" podUID="e5485249-290d-42c8-b274-6111f1454a7f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 16:45:07.799182 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:07.799074 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" Apr 16 16:45:07.799672 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:07.799204 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" Apr 16 16:45:07.801169 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:07.801130 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" podUID="1b56fe14-94af-4d7e-8541-7468f7349e1e" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 16:45:13.325769 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:13.325616 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" podUID="239f5ac8-ab81-4824-9309-d7950f9dc58f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 16:45:13.343944 
ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:13.343910 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" Apr 16 16:45:13.378647 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:13.378593 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" podUID="e5485249-290d-42c8-b274-6111f1454a7f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 16:45:17.799999 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:17.799575 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" podUID="1b56fe14-94af-4d7e-8541-7468f7349e1e" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 16:45:17.920200 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:17.920176 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-575fcb7644-7xbkj_6011ca43-9233-40aa-994e-5f3acaabf2ba/main/0.log" Apr 16 16:45:17.920619 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:17.920594 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" Apr 16 16:45:17.990519 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:17.990478 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6011ca43-9233-40aa-994e-5f3acaabf2ba-model-cache\") pod \"6011ca43-9233-40aa-994e-5f3acaabf2ba\" (UID: \"6011ca43-9233-40aa-994e-5f3acaabf2ba\") " Apr 16 16:45:17.990733 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:17.990542 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6011ca43-9233-40aa-994e-5f3acaabf2ba-tls-certs\") pod \"6011ca43-9233-40aa-994e-5f3acaabf2ba\" (UID: \"6011ca43-9233-40aa-994e-5f3acaabf2ba\") " Apr 16 16:45:17.990733 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:17.990592 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6011ca43-9233-40aa-994e-5f3acaabf2ba-dshm\") pod \"6011ca43-9233-40aa-994e-5f3acaabf2ba\" (UID: \"6011ca43-9233-40aa-994e-5f3acaabf2ba\") " Apr 16 16:45:17.990860 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:17.990726 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6011ca43-9233-40aa-994e-5f3acaabf2ba-home\") pod \"6011ca43-9233-40aa-994e-5f3acaabf2ba\" (UID: \"6011ca43-9233-40aa-994e-5f3acaabf2ba\") " Apr 16 16:45:17.990860 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:17.990772 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6011ca43-9233-40aa-994e-5f3acaabf2ba-model-cache" (OuterVolumeSpecName: "model-cache") pod "6011ca43-9233-40aa-994e-5f3acaabf2ba" (UID: "6011ca43-9233-40aa-994e-5f3acaabf2ba"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:45:17.990860 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:17.990801 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tnxw\" (UniqueName: \"kubernetes.io/projected/6011ca43-9233-40aa-994e-5f3acaabf2ba-kube-api-access-9tnxw\") pod \"6011ca43-9233-40aa-994e-5f3acaabf2ba\" (UID: \"6011ca43-9233-40aa-994e-5f3acaabf2ba\") " Apr 16 16:45:17.990860 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:17.990850 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6011ca43-9233-40aa-994e-5f3acaabf2ba-kserve-provision-location\") pod \"6011ca43-9233-40aa-994e-5f3acaabf2ba\" (UID: \"6011ca43-9233-40aa-994e-5f3acaabf2ba\") " Apr 16 16:45:17.991102 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:17.991082 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6011ca43-9233-40aa-994e-5f3acaabf2ba-home" (OuterVolumeSpecName: "home") pod "6011ca43-9233-40aa-994e-5f3acaabf2ba" (UID: "6011ca43-9233-40aa-994e-5f3acaabf2ba"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:45:17.991187 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:17.991162 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6011ca43-9233-40aa-994e-5f3acaabf2ba-home\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:45:17.991264 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:17.991186 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6011ca43-9233-40aa-994e-5f3acaabf2ba-model-cache\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:45:17.995173 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:17.993555 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6011ca43-9233-40aa-994e-5f3acaabf2ba-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6011ca43-9233-40aa-994e-5f3acaabf2ba" (UID: "6011ca43-9233-40aa-994e-5f3acaabf2ba"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:45:17.997226 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:17.995840 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6011ca43-9233-40aa-994e-5f3acaabf2ba-dshm" (OuterVolumeSpecName: "dshm") pod "6011ca43-9233-40aa-994e-5f3acaabf2ba" (UID: "6011ca43-9233-40aa-994e-5f3acaabf2ba"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:45:18.005671 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:18.005075 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6011ca43-9233-40aa-994e-5f3acaabf2ba-kube-api-access-9tnxw" (OuterVolumeSpecName: "kube-api-access-9tnxw") pod "6011ca43-9233-40aa-994e-5f3acaabf2ba" (UID: "6011ca43-9233-40aa-994e-5f3acaabf2ba"). InnerVolumeSpecName "kube-api-access-9tnxw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:45:18.053495 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:18.053433 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6011ca43-9233-40aa-994e-5f3acaabf2ba-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6011ca43-9233-40aa-994e-5f3acaabf2ba" (UID: "6011ca43-9233-40aa-994e-5f3acaabf2ba"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:45:18.092724 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:18.092683 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9tnxw\" (UniqueName: \"kubernetes.io/projected/6011ca43-9233-40aa-994e-5f3acaabf2ba-kube-api-access-9tnxw\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:45:18.092724 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:18.092718 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6011ca43-9233-40aa-994e-5f3acaabf2ba-kserve-provision-location\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:45:18.092724 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:18.092733 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6011ca43-9233-40aa-994e-5f3acaabf2ba-tls-certs\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:45:18.092957 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:18.092747 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6011ca43-9233-40aa-994e-5f3acaabf2ba-dshm\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:45:18.326989 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:18.326962 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-575fcb7644-7xbkj_6011ca43-9233-40aa-994e-5f3acaabf2ba/main/0.log" Apr 16 16:45:18.327378 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:18.327350 2577 generic.go:358] "Generic (PLEG): container finished" podID="6011ca43-9233-40aa-994e-5f3acaabf2ba" containerID="2e977664529d56dbd076040274b4694034e15c26deadad500e069ab525011360" exitCode=137 Apr 16 16:45:18.327549 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:18.327396 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" event={"ID":"6011ca43-9233-40aa-994e-5f3acaabf2ba","Type":"ContainerDied","Data":"2e977664529d56dbd076040274b4694034e15c26deadad500e069ab525011360"} Apr 16 16:45:18.327549 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:18.327421 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" event={"ID":"6011ca43-9233-40aa-994e-5f3acaabf2ba","Type":"ContainerDied","Data":"c953792a1e8a126bcb6dead20c12bd5f1257a9586db1131a260faaeb912c8934"} Apr 16 16:45:18.327549 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:18.327437 2577 scope.go:117] "RemoveContainer" containerID="2e977664529d56dbd076040274b4694034e15c26deadad500e069ab525011360" Apr 16 16:45:18.327549 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:18.327526 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj" Apr 16 16:45:18.355862 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:18.355826 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj"] Apr 16 16:45:18.356634 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:18.356606 2577 scope.go:117] "RemoveContainer" containerID="d3db97e5a1441b96d5caf9566f9cbd0338c418fa5b4ed4ae51cd0bc1e8f334f0" Apr 16 16:45:18.361336 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:18.361311 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-575fcb7644-7xbkj"] Apr 16 16:45:18.369185 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:18.369160 2577 scope.go:117] "RemoveContainer" containerID="2e977664529d56dbd076040274b4694034e15c26deadad500e069ab525011360" Apr 16 16:45:18.369754 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:45:18.369727 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e977664529d56dbd076040274b4694034e15c26deadad500e069ab525011360\": container with ID starting with 2e977664529d56dbd076040274b4694034e15c26deadad500e069ab525011360 not found: ID does not exist" containerID="2e977664529d56dbd076040274b4694034e15c26deadad500e069ab525011360" Apr 16 16:45:18.369867 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:18.369766 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e977664529d56dbd076040274b4694034e15c26deadad500e069ab525011360"} err="failed to get container status \"2e977664529d56dbd076040274b4694034e15c26deadad500e069ab525011360\": rpc error: code = NotFound desc = could not find container \"2e977664529d56dbd076040274b4694034e15c26deadad500e069ab525011360\": container with ID starting with 2e977664529d56dbd076040274b4694034e15c26deadad500e069ab525011360 not found: ID does not 
exist" Apr 16 16:45:18.369867 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:18.369796 2577 scope.go:117] "RemoveContainer" containerID="d3db97e5a1441b96d5caf9566f9cbd0338c418fa5b4ed4ae51cd0bc1e8f334f0" Apr 16 16:45:18.370176 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:45:18.370139 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3db97e5a1441b96d5caf9566f9cbd0338c418fa5b4ed4ae51cd0bc1e8f334f0\": container with ID starting with d3db97e5a1441b96d5caf9566f9cbd0338c418fa5b4ed4ae51cd0bc1e8f334f0 not found: ID does not exist" containerID="d3db97e5a1441b96d5caf9566f9cbd0338c418fa5b4ed4ae51cd0bc1e8f334f0" Apr 16 16:45:18.370242 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:18.370171 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3db97e5a1441b96d5caf9566f9cbd0338c418fa5b4ed4ae51cd0bc1e8f334f0"} err="failed to get container status \"d3db97e5a1441b96d5caf9566f9cbd0338c418fa5b4ed4ae51cd0bc1e8f334f0\": rpc error: code = NotFound desc = could not find container \"d3db97e5a1441b96d5caf9566f9cbd0338c418fa5b4ed4ae51cd0bc1e8f334f0\": container with ID starting with d3db97e5a1441b96d5caf9566f9cbd0338c418fa5b4ed4ae51cd0bc1e8f334f0 not found: ID does not exist" Apr 16 16:45:19.653772 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:19.653733 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6011ca43-9233-40aa-994e-5f3acaabf2ba" path="/var/lib/kubelet/pods/6011ca43-9233-40aa-994e-5f3acaabf2ba/volumes" Apr 16 16:45:23.325556 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:23.325515 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" podUID="239f5ac8-ab81-4824-9309-d7950f9dc58f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 
16:45:23.378335 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:23.378278 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" podUID="e5485249-290d-42c8-b274-6111f1454a7f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 16:45:27.799703 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:27.799646 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" podUID="1b56fe14-94af-4d7e-8541-7468f7349e1e" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 16:45:33.325690 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:33.325564 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" podUID="239f5ac8-ab81-4824-9309-d7950f9dc58f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 16:45:33.378648 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:33.378585 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" podUID="e5485249-290d-42c8-b274-6111f1454a7f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 16:45:37.800028 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:37.799986 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" podUID="1b56fe14-94af-4d7e-8541-7468f7349e1e" containerName="main" probeResult="failure" output="Get 
\"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 16:45:43.325931 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:43.325883 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" podUID="239f5ac8-ab81-4824-9309-d7950f9dc58f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 16:45:43.378689 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:43.378641 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" podUID="e5485249-290d-42c8-b274-6111f1454a7f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 16:45:47.800160 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:47.799682 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" podUID="1b56fe14-94af-4d7e-8541-7468f7349e1e" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 16:45:53.325765 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:53.325711 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" podUID="239f5ac8-ab81-4824-9309-d7950f9dc58f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 16:45:53.379124 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:53.379043 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" 
podUID="e5485249-290d-42c8-b274-6111f1454a7f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 16:45:57.799646 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:45:57.799603 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" podUID="1b56fe14-94af-4d7e-8541-7468f7349e1e" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 16:46:03.325832 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:46:03.325767 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" podUID="239f5ac8-ab81-4824-9309-d7950f9dc58f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 16:46:03.378413 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:46:03.378363 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" podUID="e5485249-290d-42c8-b274-6111f1454a7f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 16:46:07.800058 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:46:07.800013 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" podUID="1b56fe14-94af-4d7e-8541-7468f7349e1e" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 16:46:13.325632 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:46:13.325581 2577 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" podUID="239f5ac8-ab81-4824-9309-d7950f9dc58f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 16:46:13.379006 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:46:13.378951 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" podUID="e5485249-290d-42c8-b274-6111f1454a7f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 16:46:17.799237 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:46:17.799184 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" podUID="1b56fe14-94af-4d7e-8541-7468f7349e1e" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 16:46:23.325793 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:46:23.325746 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" podUID="239f5ac8-ab81-4824-9309-d7950f9dc58f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 16:46:23.379207 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:46:23.379151 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" podUID="e5485249-290d-42c8-b274-6111f1454a7f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 
16:46:27.799458 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:46:27.799398 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" podUID="1b56fe14-94af-4d7e-8541-7468f7349e1e" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 16:46:33.325484 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:46:33.325408 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" podUID="239f5ac8-ab81-4824-9309-d7950f9dc58f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 16:46:33.378723 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:46:33.378680 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" podUID="e5485249-290d-42c8-b274-6111f1454a7f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 16:46:37.799226 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:46:37.799180 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" podUID="1b56fe14-94af-4d7e-8541-7468f7349e1e" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 16:46:43.325660 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:46:43.325554 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" podUID="239f5ac8-ab81-4824-9309-d7950f9dc58f" containerName="main" probeResult="failure" output="Get 
\"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 16:46:43.379151 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:46:43.379112 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" podUID="e5485249-290d-42c8-b274-6111f1454a7f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 16:46:47.799547 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:46:47.799495 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" podUID="1b56fe14-94af-4d7e-8541-7468f7349e1e" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 16:46:53.325174 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:46:53.325122 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" podUID="239f5ac8-ab81-4824-9309-d7950f9dc58f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 16:46:53.378331 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:46:53.378282 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" podUID="e5485249-290d-42c8-b274-6111f1454a7f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 16:46:57.800031 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:46:57.799971 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" 
podUID="1b56fe14-94af-4d7e-8541-7468f7349e1e" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 16:47:03.326363 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:47:03.325930 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" podUID="239f5ac8-ab81-4824-9309-d7950f9dc58f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 16:47:03.378705 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:47:03.378659 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" podUID="e5485249-290d-42c8-b274-6111f1454a7f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 16:47:07.799831 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:47:07.799789 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" podUID="1b56fe14-94af-4d7e-8541-7468f7349e1e" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 16:47:13.325730 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:47:13.325623 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" podUID="239f5ac8-ab81-4824-9309-d7950f9dc58f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 16:47:13.378559 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:47:13.378507 2577 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" podUID="e5485249-290d-42c8-b274-6111f1454a7f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 16:47:17.799016 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:47:17.798966 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" podUID="1b56fe14-94af-4d7e-8541-7468f7349e1e" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 16:47:23.325637 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:47:23.325586 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" podUID="239f5ac8-ab81-4824-9309-d7950f9dc58f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 16:47:23.378260 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:47:23.378218 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" podUID="e5485249-290d-42c8-b274-6111f1454a7f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 16:47:27.799864 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:47:27.799820 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" podUID="1b56fe14-94af-4d7e-8541-7468f7349e1e" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 
16:47:33.325888 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:47:33.325839 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" podUID="239f5ac8-ab81-4824-9309-d7950f9dc58f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 16:47:33.379257 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:47:33.379208 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" podUID="e5485249-290d-42c8-b274-6111f1454a7f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 16:47:37.799287 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:47:37.799236 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" podUID="1b56fe14-94af-4d7e-8541-7468f7349e1e" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 16:47:43.325228 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:47:43.325177 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" podUID="239f5ac8-ab81-4824-9309-d7950f9dc58f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 16:47:43.378919 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:47:43.378878 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" podUID="e5485249-290d-42c8-b274-6111f1454a7f" containerName="main" probeResult="failure" output="Get 
\"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 16:47:47.799106 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:47:47.799052 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" podUID="1b56fe14-94af-4d7e-8541-7468f7349e1e" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 16:47:51.634153 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:47:51.634115 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hschh_652350aa-d2fc-4c32-bc1b-e593db927908/ovn-acl-logging/0.log" Apr 16 16:47:51.636032 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:47:51.636007 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hschh_652350aa-d2fc-4c32-bc1b-e593db927908/ovn-acl-logging/0.log" Apr 16 16:47:53.326107 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:47:53.326060 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" podUID="239f5ac8-ab81-4824-9309-d7950f9dc58f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 16:47:53.379085 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:47:53.379034 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" podUID="e5485249-290d-42c8-b274-6111f1454a7f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 16:47:57.799882 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:47:57.799842 2577 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" podUID="1b56fe14-94af-4d7e-8541-7468f7349e1e" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 16:48:03.325864 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:03.325800 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" podUID="239f5ac8-ab81-4824-9309-d7950f9dc58f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 16:48:03.378749 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:03.378696 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" podUID="e5485249-290d-42c8-b274-6111f1454a7f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 16:48:07.809267 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:07.809232 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" Apr 16 16:48:07.817241 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:07.817214 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" Apr 16 16:48:12.271642 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:12.271567 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k"] Apr 16 16:48:12.272016 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:12.271851 2577 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" podUID="1b56fe14-94af-4d7e-8541-7468f7349e1e" containerName="main" containerID="cri-o://6ca39f905b82430527e30481b8bc15f300724759904a1645144323272fb810aa" gracePeriod=30 Apr 16 16:48:13.334421 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:13.334393 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" Apr 16 16:48:13.346693 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:13.346668 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" Apr 16 16:48:13.388730 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:13.388696 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" Apr 16 16:48:13.398107 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:13.398073 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" Apr 16 16:48:17.239535 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:17.239496 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 16:48:17.240012 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:17.239996 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6011ca43-9233-40aa-994e-5f3acaabf2ba" containerName="storage-initializer" Apr 16 16:48:17.240065 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:17.240014 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6011ca43-9233-40aa-994e-5f3acaabf2ba" containerName="storage-initializer" Apr 16 16:48:17.240065 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:17.240025 2577 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6011ca43-9233-40aa-994e-5f3acaabf2ba" containerName="main" Apr 16 16:48:17.240065 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:17.240030 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6011ca43-9233-40aa-994e-5f3acaabf2ba" containerName="main" Apr 16 16:48:17.240172 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:17.240088 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="6011ca43-9233-40aa-994e-5f3acaabf2ba" containerName="main" Apr 16 16:48:17.244969 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:17.244948 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 16:48:17.249965 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:17.249942 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 16 16:48:17.251096 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:17.251074 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-7p4cb\"" Apr 16 16:48:17.283014 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:17.282979 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 16:48:17.329725 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:17.329686 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/db43c9ac-2064-4f08-90ca-9c9258909279-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"db43c9ac-2064-4f08-90ca-9c9258909279\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 
16:48:17.329725 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:17.329726 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/db43c9ac-2064-4f08-90ca-9c9258909279-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"db43c9ac-2064-4f08-90ca-9c9258909279\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 16:48:17.329966 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:17.329763 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/db43c9ac-2064-4f08-90ca-9c9258909279-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"db43c9ac-2064-4f08-90ca-9c9258909279\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 16:48:17.329966 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:17.329794 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/db43c9ac-2064-4f08-90ca-9c9258909279-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"db43c9ac-2064-4f08-90ca-9c9258909279\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 16:48:17.329966 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:17.329814 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/db43c9ac-2064-4f08-90ca-9c9258909279-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"db43c9ac-2064-4f08-90ca-9c9258909279\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 16:48:17.329966 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:17.329917 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx6bt\" (UniqueName: \"kubernetes.io/projected/db43c9ac-2064-4f08-90ca-9c9258909279-kube-api-access-kx6bt\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"db43c9ac-2064-4f08-90ca-9c9258909279\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 16:48:17.431226 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:17.431179 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kx6bt\" (UniqueName: \"kubernetes.io/projected/db43c9ac-2064-4f08-90ca-9c9258909279-kube-api-access-kx6bt\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"db43c9ac-2064-4f08-90ca-9c9258909279\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 16:48:17.431426 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:17.431301 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/db43c9ac-2064-4f08-90ca-9c9258909279-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"db43c9ac-2064-4f08-90ca-9c9258909279\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 16:48:17.431426 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:17.431331 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/db43c9ac-2064-4f08-90ca-9c9258909279-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"db43c9ac-2064-4f08-90ca-9c9258909279\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 16:48:17.431426 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:17.431362 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"tls-certs\" (UniqueName: \"kubernetes.io/secret/db43c9ac-2064-4f08-90ca-9c9258909279-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"db43c9ac-2064-4f08-90ca-9c9258909279\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 16:48:17.431426 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:17.431406 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/db43c9ac-2064-4f08-90ca-9c9258909279-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"db43c9ac-2064-4f08-90ca-9c9258909279\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 16:48:17.431712 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:17.431428 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/db43c9ac-2064-4f08-90ca-9c9258909279-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"db43c9ac-2064-4f08-90ca-9c9258909279\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 16:48:17.431781 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:17.431758 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/db43c9ac-2064-4f08-90ca-9c9258909279-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"db43c9ac-2064-4f08-90ca-9c9258909279\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 16:48:17.431872 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:17.431854 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/db43c9ac-2064-4f08-90ca-9c9258909279-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: 
\"db43c9ac-2064-4f08-90ca-9c9258909279\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 16:48:17.431967 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:17.431944 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/db43c9ac-2064-4f08-90ca-9c9258909279-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"db43c9ac-2064-4f08-90ca-9c9258909279\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 16:48:17.433899 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:17.433873 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/db43c9ac-2064-4f08-90ca-9c9258909279-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"db43c9ac-2064-4f08-90ca-9c9258909279\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 16:48:17.434288 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:17.434085 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/db43c9ac-2064-4f08-90ca-9c9258909279-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"db43c9ac-2064-4f08-90ca-9c9258909279\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 16:48:17.449729 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:17.449702 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx6bt\" (UniqueName: \"kubernetes.io/projected/db43c9ac-2064-4f08-90ca-9c9258909279-kube-api-access-kx6bt\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"db43c9ac-2064-4f08-90ca-9c9258909279\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 16:48:17.555316 ip-10-0-130-165 kubenswrapper[2577]: I0416 
16:48:17.555220 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 16:48:17.699426 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:17.699398 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 16:48:17.701860 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:48:17.701827 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb43c9ac_2064_4f08_90ca_9c9258909279.slice/crio-ad972d9c1a28d8e0adf06203bc14eba455a790632468ca87d441aeb3d9812643 WatchSource:0}: Error finding container ad972d9c1a28d8e0adf06203bc14eba455a790632468ca87d441aeb3d9812643: Status 404 returned error can't find the container with id ad972d9c1a28d8e0adf06203bc14eba455a790632468ca87d441aeb3d9812643 Apr 16 16:48:17.703794 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:17.703775 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:48:18.006366 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:18.006331 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"db43c9ac-2064-4f08-90ca-9c9258909279","Type":"ContainerStarted","Data":"b762b0a37d2051a898155744e1cd158a8e88494bc75b6b201fafe6f9ffba3b25"} Apr 16 16:48:18.006366 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:18.006372 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"db43c9ac-2064-4f08-90ca-9c9258909279","Type":"ContainerStarted","Data":"ad972d9c1a28d8e0adf06203bc14eba455a790632468ca87d441aeb3d9812643"} Apr 16 16:48:23.027497 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:23.027434 2577 generic.go:358] "Generic (PLEG): container finished" 
podID="db43c9ac-2064-4f08-90ca-9c9258909279" containerID="b762b0a37d2051a898155744e1cd158a8e88494bc75b6b201fafe6f9ffba3b25" exitCode=0 Apr 16 16:48:23.027987 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:23.027519 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"db43c9ac-2064-4f08-90ca-9c9258909279","Type":"ContainerDied","Data":"b762b0a37d2051a898155744e1cd158a8e88494bc75b6b201fafe6f9ffba3b25"} Apr 16 16:48:24.033553 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:24.033512 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"db43c9ac-2064-4f08-90ca-9c9258909279","Type":"ContainerStarted","Data":"d353dab6e4dfd0b73a4a3e81e1379d2d1b411ea8529e33d6006efdc17ee4aeac"} Apr 16 16:48:24.056083 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:24.056019 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podStartSLOduration=7.056003894 podStartE2EDuration="7.056003894s" podCreationTimestamp="2026-04-16 16:48:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:48:24.054306813 +0000 UTC m=+1533.019087310" watchObservedRunningTime="2026-04-16 16:48:24.056003894 +0000 UTC m=+1533.020784390" Apr 16 16:48:27.556056 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:27.556005 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 16:48:27.557988 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:27.557957 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="db43c9ac-2064-4f08-90ca-9c9258909279" 
containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused" Apr 16 16:48:33.558558 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:33.558518 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk"] Apr 16 16:48:33.559839 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:33.559798 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" podUID="e5485249-290d-42c8-b274-6111f1454a7f" containerName="main" containerID="cri-o://bd79469e73ea608c2601b2ac996d4052235075d3f6834a701377ddbc4867a5e1" gracePeriod=30 Apr 16 16:48:33.569790 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:33.569763 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg"] Apr 16 16:48:33.570178 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:33.570115 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" podUID="239f5ac8-ab81-4824-9309-d7950f9dc58f" containerName="main" containerID="cri-o://f0021c97ca58721ca2bd813f64ebb0acb7de38b3e3488a5e458b5d5cbbec99f4" gracePeriod=30 Apr 16 16:48:37.369010 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.368974 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9"] Apr 16 16:48:37.407368 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.407332 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq"] Apr 16 16:48:37.407564 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.407549 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" Apr 16 16:48:37.410421 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.410399 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-dockercfg-6pzh7\"" Apr 16 16:48:37.410736 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.410715 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 16 16:48:37.432088 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.432047 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9"] Apr 16 16:48:37.432088 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.432077 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq"] Apr 16 16:48:37.432338 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.432177 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" Apr 16 16:48:37.542272 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.542231 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g64b4\" (UniqueName: \"kubernetes.io/projected/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-kube-api-access-g64b4\") pod \"custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9\" (UID: \"4ae45b28-e7cb-44f5-b1d8-5de775c5a699\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" Apr 16 16:48:37.542272 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.542274 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm859\" (UniqueName: \"kubernetes.io/projected/fb4372dd-6137-44bb-8d34-df64760a1c07-kube-api-access-xm859\") pod \"custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq\" (UID: \"fb4372dd-6137-44bb-8d34-df64760a1c07\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" Apr 16 16:48:37.542586 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.542358 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-model-cache\") pod \"custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9\" (UID: \"4ae45b28-e7cb-44f5-b1d8-5de775c5a699\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" Apr 16 16:48:37.542586 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.542419 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb4372dd-6137-44bb-8d34-df64760a1c07-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq\" (UID: 
\"fb4372dd-6137-44bb-8d34-df64760a1c07\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" Apr 16 16:48:37.542586 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.542497 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fb4372dd-6137-44bb-8d34-df64760a1c07-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq\" (UID: \"fb4372dd-6137-44bb-8d34-df64760a1c07\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" Apr 16 16:48:37.542586 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.542517 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fb4372dd-6137-44bb-8d34-df64760a1c07-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq\" (UID: \"fb4372dd-6137-44bb-8d34-df64760a1c07\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" Apr 16 16:48:37.542586 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.542579 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9\" (UID: \"4ae45b28-e7cb-44f5-b1d8-5de775c5a699\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" Apr 16 16:48:37.542870 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.542608 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9\" (UID: 
\"4ae45b28-e7cb-44f5-b1d8-5de775c5a699\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" Apr 16 16:48:37.542870 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.542632 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fb4372dd-6137-44bb-8d34-df64760a1c07-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq\" (UID: \"fb4372dd-6137-44bb-8d34-df64760a1c07\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" Apr 16 16:48:37.542870 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.542658 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-home\") pod \"custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9\" (UID: \"4ae45b28-e7cb-44f5-b1d8-5de775c5a699\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" Apr 16 16:48:37.542870 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.542672 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-dshm\") pod \"custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9\" (UID: \"4ae45b28-e7cb-44f5-b1d8-5de775c5a699\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" Apr 16 16:48:37.542870 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.542738 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fb4372dd-6137-44bb-8d34-df64760a1c07-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq\" (UID: \"fb4372dd-6137-44bb-8d34-df64760a1c07\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" Apr 16 16:48:37.556606 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.556564 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="db43c9ac-2064-4f08-90ca-9c9258909279" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused" Apr 16 16:48:37.644023 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.643920 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9\" (UID: \"4ae45b28-e7cb-44f5-b1d8-5de775c5a699\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" Apr 16 16:48:37.644023 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.643969 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9\" (UID: \"4ae45b28-e7cb-44f5-b1d8-5de775c5a699\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" Apr 16 16:48:37.644023 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.643989 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fb4372dd-6137-44bb-8d34-df64760a1c07-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq\" (UID: \"fb4372dd-6137-44bb-8d34-df64760a1c07\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" Apr 16 16:48:37.644023 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.644017 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-home\") pod \"custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9\" (UID: \"4ae45b28-e7cb-44f5-b1d8-5de775c5a699\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" Apr 16 16:48:37.644367 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.644031 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-dshm\") pod \"custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9\" (UID: \"4ae45b28-e7cb-44f5-b1d8-5de775c5a699\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" Apr 16 16:48:37.644367 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.644053 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fb4372dd-6137-44bb-8d34-df64760a1c07-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq\" (UID: \"fb4372dd-6137-44bb-8d34-df64760a1c07\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" Apr 16 16:48:37.644367 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.644073 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g64b4\" (UniqueName: \"kubernetes.io/projected/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-kube-api-access-g64b4\") pod \"custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9\" (UID: \"4ae45b28-e7cb-44f5-b1d8-5de775c5a699\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" Apr 16 16:48:37.644367 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.644089 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xm859\" (UniqueName: 
\"kubernetes.io/projected/fb4372dd-6137-44bb-8d34-df64760a1c07-kube-api-access-xm859\") pod \"custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq\" (UID: \"fb4372dd-6137-44bb-8d34-df64760a1c07\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" Apr 16 16:48:37.644367 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.644118 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-model-cache\") pod \"custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9\" (UID: \"4ae45b28-e7cb-44f5-b1d8-5de775c5a699\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" Apr 16 16:48:37.644367 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.644168 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb4372dd-6137-44bb-8d34-df64760a1c07-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq\" (UID: \"fb4372dd-6137-44bb-8d34-df64760a1c07\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" Apr 16 16:48:37.644367 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.644213 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fb4372dd-6137-44bb-8d34-df64760a1c07-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq\" (UID: \"fb4372dd-6137-44bb-8d34-df64760a1c07\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" Apr 16 16:48:37.644367 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.644242 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fb4372dd-6137-44bb-8d34-df64760a1c07-home\") pod 
\"custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq\" (UID: \"fb4372dd-6137-44bb-8d34-df64760a1c07\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" Apr 16 16:48:37.644841 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.644405 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9\" (UID: \"4ae45b28-e7cb-44f5-b1d8-5de775c5a699\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" Apr 16 16:48:37.644841 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.644575 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-home\") pod \"custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9\" (UID: \"4ae45b28-e7cb-44f5-b1d8-5de775c5a699\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" Apr 16 16:48:37.644841 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.644627 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fb4372dd-6137-44bb-8d34-df64760a1c07-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq\" (UID: \"fb4372dd-6137-44bb-8d34-df64760a1c07\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" Apr 16 16:48:37.644841 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.644660 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-model-cache\") pod \"custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9\" (UID: \"4ae45b28-e7cb-44f5-b1d8-5de775c5a699\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" Apr 16 16:48:37.645056 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.644881 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb4372dd-6137-44bb-8d34-df64760a1c07-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq\" (UID: \"fb4372dd-6137-44bb-8d34-df64760a1c07\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" Apr 16 16:48:37.645178 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.645147 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fb4372dd-6137-44bb-8d34-df64760a1c07-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq\" (UID: \"fb4372dd-6137-44bb-8d34-df64760a1c07\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" Apr 16 16:48:37.646739 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.646717 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fb4372dd-6137-44bb-8d34-df64760a1c07-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq\" (UID: \"fb4372dd-6137-44bb-8d34-df64760a1c07\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" Apr 16 16:48:37.647140 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.647079 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-dshm\") pod \"custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9\" (UID: \"4ae45b28-e7cb-44f5-b1d8-5de775c5a699\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" Apr 16 16:48:37.647373 ip-10-0-130-165 kubenswrapper[2577]: I0416 
16:48:37.647354 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fb4372dd-6137-44bb-8d34-df64760a1c07-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq\" (UID: \"fb4372dd-6137-44bb-8d34-df64760a1c07\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" Apr 16 16:48:37.647484 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.647432 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9\" (UID: \"4ae45b28-e7cb-44f5-b1d8-5de775c5a699\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" Apr 16 16:48:37.653398 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.653368 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g64b4\" (UniqueName: \"kubernetes.io/projected/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-kube-api-access-g64b4\") pod \"custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9\" (UID: \"4ae45b28-e7cb-44f5-b1d8-5de775c5a699\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" Apr 16 16:48:37.653545 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.653377 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm859\" (UniqueName: \"kubernetes.io/projected/fb4372dd-6137-44bb-8d34-df64760a1c07-kube-api-access-xm859\") pod \"custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq\" (UID: \"fb4372dd-6137-44bb-8d34-df64760a1c07\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" Apr 16 16:48:37.718478 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.718406 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" Apr 16 16:48:37.742515 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.742479 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" Apr 16 16:48:37.884153 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.884124 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9"] Apr 16 16:48:37.885046 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:48:37.885014 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ae45b28_e7cb_44f5_b1d8_5de775c5a699.slice/crio-8ec90da9d523f9107da41505cf9e61362aa8583785e57749d08da145f35c773a WatchSource:0}: Error finding container 8ec90da9d523f9107da41505cf9e61362aa8583785e57749d08da145f35c773a: Status 404 returned error can't find the container with id 8ec90da9d523f9107da41505cf9e61362aa8583785e57749d08da145f35c773a Apr 16 16:48:37.902791 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:37.902763 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq"] Apr 16 16:48:37.905360 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:48:37.905336 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb4372dd_6137_44bb_8d34_df64760a1c07.slice/crio-e3480f808ad8e44f8c413478727b7f6fb342bcbd21d711fd7cb01f39bf8fe8e8 WatchSource:0}: Error finding container e3480f808ad8e44f8c413478727b7f6fb342bcbd21d711fd7cb01f39bf8fe8e8: Status 404 returned error can't find the container with id e3480f808ad8e44f8c413478727b7f6fb342bcbd21d711fd7cb01f39bf8fe8e8 Apr 16 16:48:38.094435 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:38.094390 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" event={"ID":"4ae45b28-e7cb-44f5-b1d8-5de775c5a699","Type":"ContainerStarted","Data":"fb727e75ec675b69bf02c5644df5ff4e901cc1078d674c0f66354c90aff963cb"} Apr 16 16:48:38.094435 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:38.094434 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" event={"ID":"4ae45b28-e7cb-44f5-b1d8-5de775c5a699","Type":"ContainerStarted","Data":"8ec90da9d523f9107da41505cf9e61362aa8583785e57749d08da145f35c773a"} Apr 16 16:48:38.094712 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:38.094537 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" Apr 16 16:48:38.096297 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:38.096259 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" event={"ID":"fb4372dd-6137-44bb-8d34-df64760a1c07","Type":"ContainerStarted","Data":"c0516bfd9209775710d10e61fdac1a022dad845de439c78f076da629891cbb16"} Apr 16 16:48:38.096297 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:38.096294 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" event={"ID":"fb4372dd-6137-44bb-8d34-df64760a1c07","Type":"ContainerStarted","Data":"e3480f808ad8e44f8c413478727b7f6fb342bcbd21d711fd7cb01f39bf8fe8e8"} Apr 16 16:48:39.104056 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:39.104017 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" 
event={"ID":"4ae45b28-e7cb-44f5-b1d8-5de775c5a699","Type":"ContainerStarted","Data":"c58f03b91e4c42c027d320e05c447e4e56b8379d88eaf72a66fd982a32769efb"} Apr 16 16:48:42.624063 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:42.624027 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k_1b56fe14-94af-4d7e-8541-7468f7349e1e/main/0.log" Apr 16 16:48:42.624614 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:42.624586 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" Apr 16 16:48:42.700825 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:42.700789 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1b56fe14-94af-4d7e-8541-7468f7349e1e-home\") pod \"1b56fe14-94af-4d7e-8541-7468f7349e1e\" (UID: \"1b56fe14-94af-4d7e-8541-7468f7349e1e\") " Apr 16 16:48:42.701050 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:42.700867 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b56fe14-94af-4d7e-8541-7468f7349e1e-kserve-provision-location\") pod \"1b56fe14-94af-4d7e-8541-7468f7349e1e\" (UID: \"1b56fe14-94af-4d7e-8541-7468f7349e1e\") " Apr 16 16:48:42.701050 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:42.700897 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1b56fe14-94af-4d7e-8541-7468f7349e1e-tls-certs\") pod \"1b56fe14-94af-4d7e-8541-7468f7349e1e\" (UID: \"1b56fe14-94af-4d7e-8541-7468f7349e1e\") " Apr 16 16:48:42.701050 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:42.700920 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/1b56fe14-94af-4d7e-8541-7468f7349e1e-model-cache\") pod \"1b56fe14-94af-4d7e-8541-7468f7349e1e\" (UID: \"1b56fe14-94af-4d7e-8541-7468f7349e1e\") " Apr 16 16:48:42.701050 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:42.700962 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smbgf\" (UniqueName: \"kubernetes.io/projected/1b56fe14-94af-4d7e-8541-7468f7349e1e-kube-api-access-smbgf\") pod \"1b56fe14-94af-4d7e-8541-7468f7349e1e\" (UID: \"1b56fe14-94af-4d7e-8541-7468f7349e1e\") " Apr 16 16:48:42.701050 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:42.701005 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1b56fe14-94af-4d7e-8541-7468f7349e1e-dshm\") pod \"1b56fe14-94af-4d7e-8541-7468f7349e1e\" (UID: \"1b56fe14-94af-4d7e-8541-7468f7349e1e\") " Apr 16 16:48:42.701340 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:42.701190 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b56fe14-94af-4d7e-8541-7468f7349e1e-home" (OuterVolumeSpecName: "home") pod "1b56fe14-94af-4d7e-8541-7468f7349e1e" (UID: "1b56fe14-94af-4d7e-8541-7468f7349e1e"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:48:42.701340 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:42.701312 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1b56fe14-94af-4d7e-8541-7468f7349e1e-home\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:48:42.701769 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:42.701742 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b56fe14-94af-4d7e-8541-7468f7349e1e-model-cache" (OuterVolumeSpecName: "model-cache") pod "1b56fe14-94af-4d7e-8541-7468f7349e1e" (UID: "1b56fe14-94af-4d7e-8541-7468f7349e1e"). 
InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:48:42.703708 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:42.703680 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b56fe14-94af-4d7e-8541-7468f7349e1e-kube-api-access-smbgf" (OuterVolumeSpecName: "kube-api-access-smbgf") pod "1b56fe14-94af-4d7e-8541-7468f7349e1e" (UID: "1b56fe14-94af-4d7e-8541-7468f7349e1e"). InnerVolumeSpecName "kube-api-access-smbgf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:48:42.704418 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:42.704394 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b56fe14-94af-4d7e-8541-7468f7349e1e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "1b56fe14-94af-4d7e-8541-7468f7349e1e" (UID: "1b56fe14-94af-4d7e-8541-7468f7349e1e"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:48:42.704528 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:42.704412 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b56fe14-94af-4d7e-8541-7468f7349e1e-dshm" (OuterVolumeSpecName: "dshm") pod "1b56fe14-94af-4d7e-8541-7468f7349e1e" (UID: "1b56fe14-94af-4d7e-8541-7468f7349e1e"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:48:42.749706 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:42.749669 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b56fe14-94af-4d7e-8541-7468f7349e1e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1b56fe14-94af-4d7e-8541-7468f7349e1e" (UID: "1b56fe14-94af-4d7e-8541-7468f7349e1e"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:48:42.801851 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:42.801815 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b56fe14-94af-4d7e-8541-7468f7349e1e-kserve-provision-location\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:48:42.801851 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:42.801845 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1b56fe14-94af-4d7e-8541-7468f7349e1e-tls-certs\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:48:42.801851 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:42.801856 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1b56fe14-94af-4d7e-8541-7468f7349e1e-model-cache\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:48:42.802074 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:42.801869 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-smbgf\" (UniqueName: \"kubernetes.io/projected/1b56fe14-94af-4d7e-8541-7468f7349e1e-kube-api-access-smbgf\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:48:42.802074 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:42.801882 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1b56fe14-94af-4d7e-8541-7468f7349e1e-dshm\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:48:43.123364 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:43.123327 2577 generic.go:358] "Generic (PLEG): container finished" podID="4ae45b28-e7cb-44f5-b1d8-5de775c5a699" containerID="c58f03b91e4c42c027d320e05c447e4e56b8379d88eaf72a66fd982a32769efb" exitCode=0 Apr 16 16:48:43.123553 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:43.123402 2577 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" event={"ID":"4ae45b28-e7cb-44f5-b1d8-5de775c5a699","Type":"ContainerDied","Data":"c58f03b91e4c42c027d320e05c447e4e56b8379d88eaf72a66fd982a32769efb"} Apr 16 16:48:43.124731 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:43.124710 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k_1b56fe14-94af-4d7e-8541-7468f7349e1e/main/0.log" Apr 16 16:48:43.125043 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:43.125018 2577 generic.go:358] "Generic (PLEG): container finished" podID="1b56fe14-94af-4d7e-8541-7468f7349e1e" containerID="6ca39f905b82430527e30481b8bc15f300724759904a1645144323272fb810aa" exitCode=137 Apr 16 16:48:43.125274 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:43.125118 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" Apr 16 16:48:43.125274 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:43.125124 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" event={"ID":"1b56fe14-94af-4d7e-8541-7468f7349e1e","Type":"ContainerDied","Data":"6ca39f905b82430527e30481b8bc15f300724759904a1645144323272fb810aa"} Apr 16 16:48:43.125274 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:43.125150 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k" event={"ID":"1b56fe14-94af-4d7e-8541-7468f7349e1e","Type":"ContainerDied","Data":"c95af642d75fa9101ef77249d2c7adba61e59ae2f1748f5a3a1388128e08784f"} Apr 16 16:48:43.125274 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:43.125172 2577 scope.go:117] "RemoveContainer" 
containerID="6ca39f905b82430527e30481b8bc15f300724759904a1645144323272fb810aa" Apr 16 16:48:43.126791 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:43.126770 2577 generic.go:358] "Generic (PLEG): container finished" podID="fb4372dd-6137-44bb-8d34-df64760a1c07" containerID="c0516bfd9209775710d10e61fdac1a022dad845de439c78f076da629891cbb16" exitCode=0 Apr 16 16:48:43.126899 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:43.126832 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" event={"ID":"fb4372dd-6137-44bb-8d34-df64760a1c07","Type":"ContainerDied","Data":"c0516bfd9209775710d10e61fdac1a022dad845de439c78f076da629891cbb16"} Apr 16 16:48:43.150263 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:43.150135 2577 scope.go:117] "RemoveContainer" containerID="7072181b8d55b2216142a4365dc20d1cf9967bee4e5352261a7878d4b29ab805" Apr 16 16:48:43.206029 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:43.205836 2577 scope.go:117] "RemoveContainer" containerID="6ca39f905b82430527e30481b8bc15f300724759904a1645144323272fb810aa" Apr 16 16:48:43.206641 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:48:43.206610 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ca39f905b82430527e30481b8bc15f300724759904a1645144323272fb810aa\": container with ID starting with 6ca39f905b82430527e30481b8bc15f300724759904a1645144323272fb810aa not found: ID does not exist" containerID="6ca39f905b82430527e30481b8bc15f300724759904a1645144323272fb810aa" Apr 16 16:48:43.206821 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:43.206792 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ca39f905b82430527e30481b8bc15f300724759904a1645144323272fb810aa"} err="failed to get container status \"6ca39f905b82430527e30481b8bc15f300724759904a1645144323272fb810aa\": rpc error: code = NotFound desc = 
could not find container \"6ca39f905b82430527e30481b8bc15f300724759904a1645144323272fb810aa\": container with ID starting with 6ca39f905b82430527e30481b8bc15f300724759904a1645144323272fb810aa not found: ID does not exist" Apr 16 16:48:43.206927 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:43.206912 2577 scope.go:117] "RemoveContainer" containerID="7072181b8d55b2216142a4365dc20d1cf9967bee4e5352261a7878d4b29ab805" Apr 16 16:48:43.208049 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:48:43.208018 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7072181b8d55b2216142a4365dc20d1cf9967bee4e5352261a7878d4b29ab805\": container with ID starting with 7072181b8d55b2216142a4365dc20d1cf9967bee4e5352261a7878d4b29ab805 not found: ID does not exist" containerID="7072181b8d55b2216142a4365dc20d1cf9967bee4e5352261a7878d4b29ab805" Apr 16 16:48:43.208159 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:43.208054 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7072181b8d55b2216142a4365dc20d1cf9967bee4e5352261a7878d4b29ab805"} err="failed to get container status \"7072181b8d55b2216142a4365dc20d1cf9967bee4e5352261a7878d4b29ab805\": rpc error: code = NotFound desc = could not find container \"7072181b8d55b2216142a4365dc20d1cf9967bee4e5352261a7878d4b29ab805\": container with ID starting with 7072181b8d55b2216142a4365dc20d1cf9967bee4e5352261a7878d4b29ab805 not found: ID does not exist" Apr 16 16:48:43.209539 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:43.209519 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k"] Apr 16 16:48:43.217540 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:43.217516 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-88d57bfd-x8x5k"] Apr 16 16:48:43.653082 
ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:43.652998 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b56fe14-94af-4d7e-8541-7468f7349e1e" path="/var/lib/kubelet/pods/1b56fe14-94af-4d7e-8541-7468f7349e1e/volumes" Apr 16 16:48:44.133516 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:44.133476 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" event={"ID":"4ae45b28-e7cb-44f5-b1d8-5de775c5a699","Type":"ContainerStarted","Data":"4d691db287ad1787bab8603605eb02e8ec177fc9c200d23d43b41194e0fe5283"} Apr 16 16:48:44.136422 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:44.136394 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" event={"ID":"fb4372dd-6137-44bb-8d34-df64760a1c07","Type":"ContainerStarted","Data":"c184ae8cee5af4d6995e72f74094900de4715ee9cf64491ea9ee646b6402c1fb"} Apr 16 16:48:44.165223 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:44.165157 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" podStartSLOduration=7.165136515 podStartE2EDuration="7.165136515s" podCreationTimestamp="2026-04-16 16:48:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:48:44.161623593 +0000 UTC m=+1553.126404090" watchObservedRunningTime="2026-04-16 16:48:44.165136515 +0000 UTC m=+1553.129917013" Apr 16 16:48:44.187215 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:44.187149 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" podStartSLOduration=7.18712863 podStartE2EDuration="7.18712863s" podCreationTimestamp="2026-04-16 16:48:37 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:48:44.184011167 +0000 UTC m=+1553.148791690" watchObservedRunningTime="2026-04-16 16:48:44.18712863 +0000 UTC m=+1553.151909127" Apr 16 16:48:47.556312 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:47.556264 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 16:48:47.556956 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:47.556639 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="db43c9ac-2064-4f08-90ca-9c9258909279" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused" Apr 16 16:48:47.719192 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:47.719151 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" Apr 16 16:48:47.719368 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:47.719338 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" Apr 16 16:48:47.720591 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:47.720562 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" podUID="4ae45b28-e7cb-44f5-b1d8-5de775c5a699" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 16 16:48:47.743183 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:47.743147 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" Apr 16 16:48:47.743386 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:47.743195 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" Apr 16 16:48:47.744728 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:47.744693 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" podUID="fb4372dd-6137-44bb-8d34-df64760a1c07" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 16 16:48:48.173535 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:48.173506 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" Apr 16 16:48:57.556572 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:57.556510 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="db43c9ac-2064-4f08-90ca-9c9258909279" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused" Apr 16 16:48:57.719763 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:57.719711 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" podUID="4ae45b28-e7cb-44f5-b1d8-5de775c5a699" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 16 16:48:57.743182 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:48:57.743132 2577 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" podUID="fb4372dd-6137-44bb-8d34-df64760a1c07" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 16 16:49:03.570576 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:03.570518 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" podUID="239f5ac8-ab81-4824-9309-d7950f9dc58f" containerName="llm-d-routing-sidecar" containerID="cri-o://451c9c7035a7f3c1ccc9386bda61ad0f325a351d0ba6ba21b6dbf6f8aa4e73c9" gracePeriod=2 Apr 16 16:49:03.889046 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:03.889014 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" Apr 16 16:49:03.927011 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:03.926983 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg_239f5ac8-ab81-4824-9309-d7950f9dc58f/main/0.log" Apr 16 16:49:03.927930 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:03.927903 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" Apr 16 16:49:04.032105 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.032067 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e5485249-290d-42c8-b274-6111f1454a7f-home\") pod \"e5485249-290d-42c8-b274-6111f1454a7f\" (UID: \"e5485249-290d-42c8-b274-6111f1454a7f\") " Apr 16 16:49:04.032105 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.032121 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5485249-290d-42c8-b274-6111f1454a7f-tls-certs\") pod \"e5485249-290d-42c8-b274-6111f1454a7f\" (UID: \"e5485249-290d-42c8-b274-6111f1454a7f\") " Apr 16 16:49:04.032376 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.032148 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/239f5ac8-ab81-4824-9309-d7950f9dc58f-dshm\") pod \"239f5ac8-ab81-4824-9309-d7950f9dc58f\" (UID: \"239f5ac8-ab81-4824-9309-d7950f9dc58f\") " Apr 16 16:49:04.032376 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.032183 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kkzc\" (UniqueName: \"kubernetes.io/projected/239f5ac8-ab81-4824-9309-d7950f9dc58f-kube-api-access-2kkzc\") pod \"239f5ac8-ab81-4824-9309-d7950f9dc58f\" (UID: \"239f5ac8-ab81-4824-9309-d7950f9dc58f\") " Apr 16 16:49:04.032376 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.032209 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/239f5ac8-ab81-4824-9309-d7950f9dc58f-kserve-provision-location\") pod \"239f5ac8-ab81-4824-9309-d7950f9dc58f\" (UID: \"239f5ac8-ab81-4824-9309-d7950f9dc58f\") " Apr 16 16:49:04.032376 
ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.032260 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5485249-290d-42c8-b274-6111f1454a7f-kserve-provision-location\") pod \"e5485249-290d-42c8-b274-6111f1454a7f\" (UID: \"e5485249-290d-42c8-b274-6111f1454a7f\") " Apr 16 16:49:04.032376 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.032280 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e5485249-290d-42c8-b274-6111f1454a7f-dshm\") pod \"e5485249-290d-42c8-b274-6111f1454a7f\" (UID: \"e5485249-290d-42c8-b274-6111f1454a7f\") " Apr 16 16:49:04.032376 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.032309 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/239f5ac8-ab81-4824-9309-d7950f9dc58f-tls-certs\") pod \"239f5ac8-ab81-4824-9309-d7950f9dc58f\" (UID: \"239f5ac8-ab81-4824-9309-d7950f9dc58f\") " Apr 16 16:49:04.032697 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.032378 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/239f5ac8-ab81-4824-9309-d7950f9dc58f-model-cache\") pod \"239f5ac8-ab81-4824-9309-d7950f9dc58f\" (UID: \"239f5ac8-ab81-4824-9309-d7950f9dc58f\") " Apr 16 16:49:04.032697 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.032407 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5485249-290d-42c8-b274-6111f1454a7f-model-cache\") pod \"e5485249-290d-42c8-b274-6111f1454a7f\" (UID: \"e5485249-290d-42c8-b274-6111f1454a7f\") " Apr 16 16:49:04.032697 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.032480 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-7b27b\" (UniqueName: \"kubernetes.io/projected/e5485249-290d-42c8-b274-6111f1454a7f-kube-api-access-7b27b\") pod \"e5485249-290d-42c8-b274-6111f1454a7f\" (UID: \"e5485249-290d-42c8-b274-6111f1454a7f\") " Apr 16 16:49:04.032697 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.032517 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/239f5ac8-ab81-4824-9309-d7950f9dc58f-home\") pod \"239f5ac8-ab81-4824-9309-d7950f9dc58f\" (UID: \"239f5ac8-ab81-4824-9309-d7950f9dc58f\") " Apr 16 16:49:04.032697 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.032534 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5485249-290d-42c8-b274-6111f1454a7f-home" (OuterVolumeSpecName: "home") pod "e5485249-290d-42c8-b274-6111f1454a7f" (UID: "e5485249-290d-42c8-b274-6111f1454a7f"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:49:04.032943 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.032858 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e5485249-290d-42c8-b274-6111f1454a7f-home\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:49:04.033619 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.033199 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/239f5ac8-ab81-4824-9309-d7950f9dc58f-model-cache" (OuterVolumeSpecName: "model-cache") pod "239f5ac8-ab81-4824-9309-d7950f9dc58f" (UID: "239f5ac8-ab81-4824-9309-d7950f9dc58f"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:49:04.033619 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.033279 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/239f5ac8-ab81-4824-9309-d7950f9dc58f-home" (OuterVolumeSpecName: "home") pod "239f5ac8-ab81-4824-9309-d7950f9dc58f" (UID: "239f5ac8-ab81-4824-9309-d7950f9dc58f"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:49:04.033619 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.033359 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5485249-290d-42c8-b274-6111f1454a7f-model-cache" (OuterVolumeSpecName: "model-cache") pod "e5485249-290d-42c8-b274-6111f1454a7f" (UID: "e5485249-290d-42c8-b274-6111f1454a7f"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:49:04.036353 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.036183 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5485249-290d-42c8-b274-6111f1454a7f-kube-api-access-7b27b" (OuterVolumeSpecName: "kube-api-access-7b27b") pod "e5485249-290d-42c8-b274-6111f1454a7f" (UID: "e5485249-290d-42c8-b274-6111f1454a7f"). InnerVolumeSpecName "kube-api-access-7b27b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:49:04.036353 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.036316 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/239f5ac8-ab81-4824-9309-d7950f9dc58f-kube-api-access-2kkzc" (OuterVolumeSpecName: "kube-api-access-2kkzc") pod "239f5ac8-ab81-4824-9309-d7950f9dc58f" (UID: "239f5ac8-ab81-4824-9309-d7950f9dc58f"). InnerVolumeSpecName "kube-api-access-2kkzc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:49:04.038089 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.038054 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/239f5ac8-ab81-4824-9309-d7950f9dc58f-dshm" (OuterVolumeSpecName: "dshm") pod "239f5ac8-ab81-4824-9309-d7950f9dc58f" (UID: "239f5ac8-ab81-4824-9309-d7950f9dc58f"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:49:04.038318 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.038286 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5485249-290d-42c8-b274-6111f1454a7f-dshm" (OuterVolumeSpecName: "dshm") pod "e5485249-290d-42c8-b274-6111f1454a7f" (UID: "e5485249-290d-42c8-b274-6111f1454a7f"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:49:04.038888 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.038853 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5485249-290d-42c8-b274-6111f1454a7f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e5485249-290d-42c8-b274-6111f1454a7f" (UID: "e5485249-290d-42c8-b274-6111f1454a7f"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:49:04.039989 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.039954 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/239f5ac8-ab81-4824-9309-d7950f9dc58f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "239f5ac8-ab81-4824-9309-d7950f9dc58f" (UID: "239f5ac8-ab81-4824-9309-d7950f9dc58f"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:49:04.074438 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.074370 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5485249-290d-42c8-b274-6111f1454a7f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e5485249-290d-42c8-b274-6111f1454a7f" (UID: "e5485249-290d-42c8-b274-6111f1454a7f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:49:04.120872 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.120777 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/239f5ac8-ab81-4824-9309-d7950f9dc58f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "239f5ac8-ab81-4824-9309-d7950f9dc58f" (UID: "239f5ac8-ab81-4824-9309-d7950f9dc58f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:49:04.133877 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.133838 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7b27b\" (UniqueName: \"kubernetes.io/projected/e5485249-290d-42c8-b274-6111f1454a7f-kube-api-access-7b27b\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:49:04.133877 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.133883 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/239f5ac8-ab81-4824-9309-d7950f9dc58f-home\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:49:04.134077 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.133901 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5485249-290d-42c8-b274-6111f1454a7f-tls-certs\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:49:04.134077 ip-10-0-130-165 
kubenswrapper[2577]: I0416 16:49:04.133916 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/239f5ac8-ab81-4824-9309-d7950f9dc58f-dshm\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:49:04.134077 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.133931 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2kkzc\" (UniqueName: \"kubernetes.io/projected/239f5ac8-ab81-4824-9309-d7950f9dc58f-kube-api-access-2kkzc\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:49:04.134077 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.133944 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/239f5ac8-ab81-4824-9309-d7950f9dc58f-kserve-provision-location\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:49:04.134077 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.133960 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5485249-290d-42c8-b274-6111f1454a7f-kserve-provision-location\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:49:04.134077 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.133973 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e5485249-290d-42c8-b274-6111f1454a7f-dshm\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:49:04.134077 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.133988 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/239f5ac8-ab81-4824-9309-d7950f9dc58f-tls-certs\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:49:04.134077 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.134001 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" 
(UniqueName: \"kubernetes.io/empty-dir/239f5ac8-ab81-4824-9309-d7950f9dc58f-model-cache\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:49:04.134077 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.134013 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5485249-290d-42c8-b274-6111f1454a7f-model-cache\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:49:04.225266 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.225234 2577 generic.go:358] "Generic (PLEG): container finished" podID="e5485249-290d-42c8-b274-6111f1454a7f" containerID="bd79469e73ea608c2601b2ac996d4052235075d3f6834a701377ddbc4867a5e1" exitCode=137 Apr 16 16:49:04.225480 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.225321 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" Apr 16 16:49:04.225480 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.225319 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" event={"ID":"e5485249-290d-42c8-b274-6111f1454a7f","Type":"ContainerDied","Data":"bd79469e73ea608c2601b2ac996d4052235075d3f6834a701377ddbc4867a5e1"} Apr 16 16:49:04.225480 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.225369 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk" event={"ID":"e5485249-290d-42c8-b274-6111f1454a7f","Type":"ContainerDied","Data":"0d52d299d4a4b70d7995dfdbb73a43288d7fd13c08472bb110684e42eaf758c7"} Apr 16 16:49:04.225480 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.225392 2577 scope.go:117] "RemoveContainer" containerID="bd79469e73ea608c2601b2ac996d4052235075d3f6834a701377ddbc4867a5e1" Apr 16 16:49:04.227519 ip-10-0-130-165 kubenswrapper[2577]: I0416 
16:49:04.227501 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg_239f5ac8-ab81-4824-9309-d7950f9dc58f/main/0.log" Apr 16 16:49:04.228226 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.228203 2577 generic.go:358] "Generic (PLEG): container finished" podID="239f5ac8-ab81-4824-9309-d7950f9dc58f" containerID="f0021c97ca58721ca2bd813f64ebb0acb7de38b3e3488a5e458b5d5cbbec99f4" exitCode=137 Apr 16 16:49:04.228344 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.228241 2577 generic.go:358] "Generic (PLEG): container finished" podID="239f5ac8-ab81-4824-9309-d7950f9dc58f" containerID="451c9c7035a7f3c1ccc9386bda61ad0f325a351d0ba6ba21b6dbf6f8aa4e73c9" exitCode=0 Apr 16 16:49:04.228344 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.228311 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" Apr 16 16:49:04.228479 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.228353 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" event={"ID":"239f5ac8-ab81-4824-9309-d7950f9dc58f","Type":"ContainerDied","Data":"f0021c97ca58721ca2bd813f64ebb0acb7de38b3e3488a5e458b5d5cbbec99f4"} Apr 16 16:49:04.228479 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.228400 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" event={"ID":"239f5ac8-ab81-4824-9309-d7950f9dc58f","Type":"ContainerDied","Data":"451c9c7035a7f3c1ccc9386bda61ad0f325a351d0ba6ba21b6dbf6f8aa4e73c9"} Apr 16 16:49:04.228479 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.228417 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg" 
event={"ID":"239f5ac8-ab81-4824-9309-d7950f9dc58f","Type":"ContainerDied","Data":"8bf1b952707893146c85140fefb61a1efa73067b151063d5096b16223c34571c"} Apr 16 16:49:04.255410 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.255353 2577 scope.go:117] "RemoveContainer" containerID="58fe4d30b28fbbbc689bf6b3d54c4a06861988b8251fd6dfb98a14137e8bbfdb" Apr 16 16:49:04.257066 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.256936 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk"] Apr 16 16:49:04.262245 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.262208 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6d5xjpk"] Apr 16 16:49:04.272843 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.272801 2577 scope.go:117] "RemoveContainer" containerID="bd79469e73ea608c2601b2ac996d4052235075d3f6834a701377ddbc4867a5e1" Apr 16 16:49:04.273497 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:49:04.273208 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd79469e73ea608c2601b2ac996d4052235075d3f6834a701377ddbc4867a5e1\": container with ID starting with bd79469e73ea608c2601b2ac996d4052235075d3f6834a701377ddbc4867a5e1 not found: ID does not exist" containerID="bd79469e73ea608c2601b2ac996d4052235075d3f6834a701377ddbc4867a5e1" Apr 16 16:49:04.273497 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.273246 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd79469e73ea608c2601b2ac996d4052235075d3f6834a701377ddbc4867a5e1"} err="failed to get container status \"bd79469e73ea608c2601b2ac996d4052235075d3f6834a701377ddbc4867a5e1\": rpc error: code = NotFound desc = could not find container \"bd79469e73ea608c2601b2ac996d4052235075d3f6834a701377ddbc4867a5e1\": container with ID starting with 
bd79469e73ea608c2601b2ac996d4052235075d3f6834a701377ddbc4867a5e1 not found: ID does not exist" Apr 16 16:49:04.273497 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.273276 2577 scope.go:117] "RemoveContainer" containerID="58fe4d30b28fbbbc689bf6b3d54c4a06861988b8251fd6dfb98a14137e8bbfdb" Apr 16 16:49:04.273765 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:49:04.273551 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58fe4d30b28fbbbc689bf6b3d54c4a06861988b8251fd6dfb98a14137e8bbfdb\": container with ID starting with 58fe4d30b28fbbbc689bf6b3d54c4a06861988b8251fd6dfb98a14137e8bbfdb not found: ID does not exist" containerID="58fe4d30b28fbbbc689bf6b3d54c4a06861988b8251fd6dfb98a14137e8bbfdb" Apr 16 16:49:04.273765 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.273576 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58fe4d30b28fbbbc689bf6b3d54c4a06861988b8251fd6dfb98a14137e8bbfdb"} err="failed to get container status \"58fe4d30b28fbbbc689bf6b3d54c4a06861988b8251fd6dfb98a14137e8bbfdb\": rpc error: code = NotFound desc = could not find container \"58fe4d30b28fbbbc689bf6b3d54c4a06861988b8251fd6dfb98a14137e8bbfdb\": container with ID starting with 58fe4d30b28fbbbc689bf6b3d54c4a06861988b8251fd6dfb98a14137e8bbfdb not found: ID does not exist" Apr 16 16:49:04.273765 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.273596 2577 scope.go:117] "RemoveContainer" containerID="f0021c97ca58721ca2bd813f64ebb0acb7de38b3e3488a5e458b5d5cbbec99f4" Apr 16 16:49:04.275856 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.275833 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg"] Apr 16 16:49:04.283880 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.283834 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8547b4b7ccrvpvg"] Apr 16 16:49:04.286209 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.286181 2577 scope.go:117] "RemoveContainer" containerID="a0f0f663e07b57e100a2e3b31b3fbd308f2f23f8adbcf2080947f5a460c5e569" Apr 16 16:49:04.300323 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.300295 2577 scope.go:117] "RemoveContainer" containerID="451c9c7035a7f3c1ccc9386bda61ad0f325a351d0ba6ba21b6dbf6f8aa4e73c9" Apr 16 16:49:04.310965 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.310941 2577 scope.go:117] "RemoveContainer" containerID="f0021c97ca58721ca2bd813f64ebb0acb7de38b3e3488a5e458b5d5cbbec99f4" Apr 16 16:49:04.311423 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:49:04.311395 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0021c97ca58721ca2bd813f64ebb0acb7de38b3e3488a5e458b5d5cbbec99f4\": container with ID starting with f0021c97ca58721ca2bd813f64ebb0acb7de38b3e3488a5e458b5d5cbbec99f4 not found: ID does not exist" containerID="f0021c97ca58721ca2bd813f64ebb0acb7de38b3e3488a5e458b5d5cbbec99f4" Apr 16 16:49:04.311581 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.311427 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0021c97ca58721ca2bd813f64ebb0acb7de38b3e3488a5e458b5d5cbbec99f4"} err="failed to get container status \"f0021c97ca58721ca2bd813f64ebb0acb7de38b3e3488a5e458b5d5cbbec99f4\": rpc error: code = NotFound desc = could not find container \"f0021c97ca58721ca2bd813f64ebb0acb7de38b3e3488a5e458b5d5cbbec99f4\": container with ID starting with f0021c97ca58721ca2bd813f64ebb0acb7de38b3e3488a5e458b5d5cbbec99f4 not found: ID does not exist" Apr 16 16:49:04.311581 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.311469 2577 scope.go:117] "RemoveContainer" containerID="a0f0f663e07b57e100a2e3b31b3fbd308f2f23f8adbcf2080947f5a460c5e569" Apr 16 
16:49:04.311824 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:49:04.311797 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0f0f663e07b57e100a2e3b31b3fbd308f2f23f8adbcf2080947f5a460c5e569\": container with ID starting with a0f0f663e07b57e100a2e3b31b3fbd308f2f23f8adbcf2080947f5a460c5e569 not found: ID does not exist" containerID="a0f0f663e07b57e100a2e3b31b3fbd308f2f23f8adbcf2080947f5a460c5e569" Apr 16 16:49:04.311963 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.311930 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0f0f663e07b57e100a2e3b31b3fbd308f2f23f8adbcf2080947f5a460c5e569"} err="failed to get container status \"a0f0f663e07b57e100a2e3b31b3fbd308f2f23f8adbcf2080947f5a460c5e569\": rpc error: code = NotFound desc = could not find container \"a0f0f663e07b57e100a2e3b31b3fbd308f2f23f8adbcf2080947f5a460c5e569\": container with ID starting with a0f0f663e07b57e100a2e3b31b3fbd308f2f23f8adbcf2080947f5a460c5e569 not found: ID does not exist" Apr 16 16:49:04.312060 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.311966 2577 scope.go:117] "RemoveContainer" containerID="451c9c7035a7f3c1ccc9386bda61ad0f325a351d0ba6ba21b6dbf6f8aa4e73c9" Apr 16 16:49:04.312290 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:49:04.312269 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"451c9c7035a7f3c1ccc9386bda61ad0f325a351d0ba6ba21b6dbf6f8aa4e73c9\": container with ID starting with 451c9c7035a7f3c1ccc9386bda61ad0f325a351d0ba6ba21b6dbf6f8aa4e73c9 not found: ID does not exist" containerID="451c9c7035a7f3c1ccc9386bda61ad0f325a351d0ba6ba21b6dbf6f8aa4e73c9" Apr 16 16:49:04.312290 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.312293 2577 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"451c9c7035a7f3c1ccc9386bda61ad0f325a351d0ba6ba21b6dbf6f8aa4e73c9"} err="failed to get container status \"451c9c7035a7f3c1ccc9386bda61ad0f325a351d0ba6ba21b6dbf6f8aa4e73c9\": rpc error: code = NotFound desc = could not find container \"451c9c7035a7f3c1ccc9386bda61ad0f325a351d0ba6ba21b6dbf6f8aa4e73c9\": container with ID starting with 451c9c7035a7f3c1ccc9386bda61ad0f325a351d0ba6ba21b6dbf6f8aa4e73c9 not found: ID does not exist" Apr 16 16:49:04.312537 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.312310 2577 scope.go:117] "RemoveContainer" containerID="f0021c97ca58721ca2bd813f64ebb0acb7de38b3e3488a5e458b5d5cbbec99f4" Apr 16 16:49:04.312655 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.312625 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0021c97ca58721ca2bd813f64ebb0acb7de38b3e3488a5e458b5d5cbbec99f4"} err="failed to get container status \"f0021c97ca58721ca2bd813f64ebb0acb7de38b3e3488a5e458b5d5cbbec99f4\": rpc error: code = NotFound desc = could not find container \"f0021c97ca58721ca2bd813f64ebb0acb7de38b3e3488a5e458b5d5cbbec99f4\": container with ID starting with f0021c97ca58721ca2bd813f64ebb0acb7de38b3e3488a5e458b5d5cbbec99f4 not found: ID does not exist" Apr 16 16:49:04.312729 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.312657 2577 scope.go:117] "RemoveContainer" containerID="a0f0f663e07b57e100a2e3b31b3fbd308f2f23f8adbcf2080947f5a460c5e569" Apr 16 16:49:04.312922 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.312900 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0f0f663e07b57e100a2e3b31b3fbd308f2f23f8adbcf2080947f5a460c5e569"} err="failed to get container status \"a0f0f663e07b57e100a2e3b31b3fbd308f2f23f8adbcf2080947f5a460c5e569\": rpc error: code = NotFound desc = could not find container \"a0f0f663e07b57e100a2e3b31b3fbd308f2f23f8adbcf2080947f5a460c5e569\": container with ID starting with 
a0f0f663e07b57e100a2e3b31b3fbd308f2f23f8adbcf2080947f5a460c5e569 not found: ID does not exist" Apr 16 16:49:04.313005 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.312924 2577 scope.go:117] "RemoveContainer" containerID="451c9c7035a7f3c1ccc9386bda61ad0f325a351d0ba6ba21b6dbf6f8aa4e73c9" Apr 16 16:49:04.313202 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:04.313171 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"451c9c7035a7f3c1ccc9386bda61ad0f325a351d0ba6ba21b6dbf6f8aa4e73c9"} err="failed to get container status \"451c9c7035a7f3c1ccc9386bda61ad0f325a351d0ba6ba21b6dbf6f8aa4e73c9\": rpc error: code = NotFound desc = could not find container \"451c9c7035a7f3c1ccc9386bda61ad0f325a351d0ba6ba21b6dbf6f8aa4e73c9\": container with ID starting with 451c9c7035a7f3c1ccc9386bda61ad0f325a351d0ba6ba21b6dbf6f8aa4e73c9 not found: ID does not exist" Apr 16 16:49:05.652747 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:05.652707 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="239f5ac8-ab81-4824-9309-d7950f9dc58f" path="/var/lib/kubelet/pods/239f5ac8-ab81-4824-9309-d7950f9dc58f/volumes" Apr 16 16:49:05.653478 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:05.653457 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5485249-290d-42c8-b274-6111f1454a7f" path="/var/lib/kubelet/pods/e5485249-290d-42c8-b274-6111f1454a7f/volumes" Apr 16 16:49:07.556723 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:07.556664 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="db43c9ac-2064-4f08-90ca-9c9258909279" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused" Apr 16 16:49:07.719770 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:07.719721 2577 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" podUID="4ae45b28-e7cb-44f5-b1d8-5de775c5a699" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 16 16:49:07.743130 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:07.743081 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" podUID="fb4372dd-6137-44bb-8d34-df64760a1c07" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 16 16:49:17.556811 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:17.556752 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="db43c9ac-2064-4f08-90ca-9c9258909279" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused" Apr 16 16:49:17.719199 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:17.719150 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" podUID="4ae45b28-e7cb-44f5-b1d8-5de775c5a699" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 16 16:49:17.743873 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:17.743826 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" podUID="fb4372dd-6137-44bb-8d34-df64760a1c07" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 16 16:49:27.555794 ip-10-0-130-165 kubenswrapper[2577]: I0416 
16:49:27.555730 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="db43c9ac-2064-4f08-90ca-9c9258909279" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused" Apr 16 16:49:27.719144 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:27.719087 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" podUID="4ae45b28-e7cb-44f5-b1d8-5de775c5a699" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 16 16:49:27.743566 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:27.743517 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" podUID="fb4372dd-6137-44bb-8d34-df64760a1c07" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 16 16:49:37.556484 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:37.556404 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="db43c9ac-2064-4f08-90ca-9c9258909279" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused" Apr 16 16:49:37.719599 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:37.719546 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" podUID="4ae45b28-e7cb-44f5-b1d8-5de775c5a699" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 16 
16:49:37.743851 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:37.743797 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" podUID="fb4372dd-6137-44bb-8d34-df64760a1c07" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused"
Apr 16 16:49:47.555983 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:47.555931 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="db43c9ac-2064-4f08-90ca-9c9258909279" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused"
Apr 16 16:49:47.719111 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:47.719061 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" podUID="4ae45b28-e7cb-44f5-b1d8-5de775c5a699" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused"
Apr 16 16:49:47.743835 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:47.743793 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" podUID="fb4372dd-6137-44bb-8d34-df64760a1c07" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused"
Apr 16 16:49:57.556231 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:57.556182 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="db43c9ac-2064-4f08-90ca-9c9258909279" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused"
Apr 16 16:49:57.719528 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:57.719472 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" podUID="4ae45b28-e7cb-44f5-b1d8-5de775c5a699" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused"
Apr 16 16:49:57.743438 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:49:57.743398 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" podUID="fb4372dd-6137-44bb-8d34-df64760a1c07" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused"
Apr 16 16:50:07.555933 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:50:07.555879 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="db43c9ac-2064-4f08-90ca-9c9258909279" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused"
Apr 16 16:50:07.719077 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:50:07.719036 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" podUID="4ae45b28-e7cb-44f5-b1d8-5de775c5a699" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused"
Apr 16 16:50:07.743496 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:50:07.743440 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" podUID="fb4372dd-6137-44bb-8d34-df64760a1c07" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused"
Apr 16 16:50:17.556407 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:50:17.556353 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="db43c9ac-2064-4f08-90ca-9c9258909279" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused"
Apr 16 16:50:17.719738 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:50:17.719695 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" podUID="4ae45b28-e7cb-44f5-b1d8-5de775c5a699" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused"
Apr 16 16:50:17.743872 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:50:17.743828 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" podUID="fb4372dd-6137-44bb-8d34-df64760a1c07" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused"
Apr 16 16:50:27.555735 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:50:27.555682 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="db43c9ac-2064-4f08-90ca-9c9258909279" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused"
Apr 16 16:50:27.718911 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:50:27.718848 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" podUID="4ae45b28-e7cb-44f5-b1d8-5de775c5a699" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused"
Apr 16 16:50:27.743673 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:50:27.743635 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" podUID="fb4372dd-6137-44bb-8d34-df64760a1c07" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused"
Apr 16 16:50:37.555842 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:50:37.555795 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="db43c9ac-2064-4f08-90ca-9c9258909279" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused"
Apr 16 16:50:37.719031 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:50:37.718974 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" podUID="4ae45b28-e7cb-44f5-b1d8-5de775c5a699" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused"
Apr 16 16:50:37.742969 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:50:37.742924 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" podUID="fb4372dd-6137-44bb-8d34-df64760a1c07" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused"
Apr 16 16:50:47.556798 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:50:47.556744 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="db43c9ac-2064-4f08-90ca-9c9258909279" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused"
Apr 16 16:50:47.719457 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:50:47.719395 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" podUID="4ae45b28-e7cb-44f5-b1d8-5de775c5a699" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused"
Apr 16 16:50:47.743030 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:50:47.742990 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" podUID="fb4372dd-6137-44bb-8d34-df64760a1c07" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused"
Apr 16 16:50:57.566278 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:50:57.566237 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 16:50:57.574263 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:50:57.574230 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 16:50:57.719917 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:50:57.719874 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" podUID="4ae45b28-e7cb-44f5-b1d8-5de775c5a699" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused"
Apr 16 16:50:57.743432 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:50:57.743391 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" podUID="fb4372dd-6137-44bb-8d34-df64760a1c07" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused"
Apr 16 16:51:07.718950 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:07.718899 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" podUID="4ae45b28-e7cb-44f5-b1d8-5de775c5a699" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused"
Apr 16 16:51:07.743628 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:07.743576 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" podUID="fb4372dd-6137-44bb-8d34-df64760a1c07" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused"
Apr 16 16:51:13.980565 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:13.980472 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 16 16:51:13.981007 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:13.980812 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="db43c9ac-2064-4f08-90ca-9c9258909279" containerName="main" containerID="cri-o://d353dab6e4dfd0b73a4a3e81e1379d2d1b411ea8529e33d6006efdc17ee4aeac" gracePeriod=30
Apr 16 16:51:15.441970 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:15.441946 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 16:51:15.550069 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:15.549981 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/db43c9ac-2064-4f08-90ca-9c9258909279-model-cache\") pod \"db43c9ac-2064-4f08-90ca-9c9258909279\" (UID: \"db43c9ac-2064-4f08-90ca-9c9258909279\") "
Apr 16 16:51:15.550069 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:15.550026 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/db43c9ac-2064-4f08-90ca-9c9258909279-kserve-provision-location\") pod \"db43c9ac-2064-4f08-90ca-9c9258909279\" (UID: \"db43c9ac-2064-4f08-90ca-9c9258909279\") "
Apr 16 16:51:15.550308 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:15.550130 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/db43c9ac-2064-4f08-90ca-9c9258909279-home\") pod \"db43c9ac-2064-4f08-90ca-9c9258909279\" (UID: \"db43c9ac-2064-4f08-90ca-9c9258909279\") "
Apr 16 16:51:15.550308 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:15.550155 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx6bt\" (UniqueName: \"kubernetes.io/projected/db43c9ac-2064-4f08-90ca-9c9258909279-kube-api-access-kx6bt\") pod \"db43c9ac-2064-4f08-90ca-9c9258909279\" (UID: \"db43c9ac-2064-4f08-90ca-9c9258909279\") "
Apr 16 16:51:15.550308 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:15.550194 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/db43c9ac-2064-4f08-90ca-9c9258909279-tls-certs\") pod \"db43c9ac-2064-4f08-90ca-9c9258909279\" (UID: \"db43c9ac-2064-4f08-90ca-9c9258909279\") "
Apr 16 16:51:15.550308 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:15.550240 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/db43c9ac-2064-4f08-90ca-9c9258909279-dshm\") pod \"db43c9ac-2064-4f08-90ca-9c9258909279\" (UID: \"db43c9ac-2064-4f08-90ca-9c9258909279\") "
Apr 16 16:51:15.550550 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:15.550284 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db43c9ac-2064-4f08-90ca-9c9258909279-model-cache" (OuterVolumeSpecName: "model-cache") pod "db43c9ac-2064-4f08-90ca-9c9258909279" (UID: "db43c9ac-2064-4f08-90ca-9c9258909279"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:51:15.550607 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:15.550558 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/db43c9ac-2064-4f08-90ca-9c9258909279-model-cache\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:51:15.550607 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:15.550578 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db43c9ac-2064-4f08-90ca-9c9258909279-home" (OuterVolumeSpecName: "home") pod "db43c9ac-2064-4f08-90ca-9c9258909279" (UID: "db43c9ac-2064-4f08-90ca-9c9258909279"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:51:15.552501 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:15.552460 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db43c9ac-2064-4f08-90ca-9c9258909279-dshm" (OuterVolumeSpecName: "dshm") pod "db43c9ac-2064-4f08-90ca-9c9258909279" (UID: "db43c9ac-2064-4f08-90ca-9c9258909279"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:51:15.552501 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:15.552479 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db43c9ac-2064-4f08-90ca-9c9258909279-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "db43c9ac-2064-4f08-90ca-9c9258909279" (UID: "db43c9ac-2064-4f08-90ca-9c9258909279"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:51:15.552724 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:15.552632 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db43c9ac-2064-4f08-90ca-9c9258909279-kube-api-access-kx6bt" (OuterVolumeSpecName: "kube-api-access-kx6bt") pod "db43c9ac-2064-4f08-90ca-9c9258909279" (UID: "db43c9ac-2064-4f08-90ca-9c9258909279"). InnerVolumeSpecName "kube-api-access-kx6bt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:51:15.602227 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:15.602167 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db43c9ac-2064-4f08-90ca-9c9258909279-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "db43c9ac-2064-4f08-90ca-9c9258909279" (UID: "db43c9ac-2064-4f08-90ca-9c9258909279"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:51:15.652677 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:15.652639 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/db43c9ac-2064-4f08-90ca-9c9258909279-tls-certs\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:51:15.652677 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:15.652686 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/db43c9ac-2064-4f08-90ca-9c9258909279-dshm\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:51:15.652947 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:15.652701 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/db43c9ac-2064-4f08-90ca-9c9258909279-kserve-provision-location\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:51:15.652947 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:15.652714 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/db43c9ac-2064-4f08-90ca-9c9258909279-home\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:51:15.652947 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:15.652729 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kx6bt\" (UniqueName: \"kubernetes.io/projected/db43c9ac-2064-4f08-90ca-9c9258909279-kube-api-access-kx6bt\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:51:15.748591 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:15.748555 2577 generic.go:358] "Generic (PLEG): container finished" podID="db43c9ac-2064-4f08-90ca-9c9258909279" containerID="d353dab6e4dfd0b73a4a3e81e1379d2d1b411ea8529e33d6006efdc17ee4aeac" exitCode=0
Apr 16 16:51:15.748801 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:15.748658 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"db43c9ac-2064-4f08-90ca-9c9258909279","Type":"ContainerDied","Data":"d353dab6e4dfd0b73a4a3e81e1379d2d1b411ea8529e33d6006efdc17ee4aeac"}
Apr 16 16:51:15.748801 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:15.748694 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 16:51:15.748801 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:15.748709 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"db43c9ac-2064-4f08-90ca-9c9258909279","Type":"ContainerDied","Data":"ad972d9c1a28d8e0adf06203bc14eba455a790632468ca87d441aeb3d9812643"}
Apr 16 16:51:15.748801 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:15.748730 2577 scope.go:117] "RemoveContainer" containerID="d353dab6e4dfd0b73a4a3e81e1379d2d1b411ea8529e33d6006efdc17ee4aeac"
Apr 16 16:51:15.766170 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:15.766130 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 16 16:51:15.769085 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:15.769065 2577 scope.go:117] "RemoveContainer" containerID="b762b0a37d2051a898155744e1cd158a8e88494bc75b6b201fafe6f9ffba3b25"
Apr 16 16:51:15.772709 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:15.772683 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 16 16:51:15.827099 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:15.827072 2577 scope.go:117] "RemoveContainer" containerID="d353dab6e4dfd0b73a4a3e81e1379d2d1b411ea8529e33d6006efdc17ee4aeac"
Apr 16 16:51:15.827505 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:51:15.827481 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d353dab6e4dfd0b73a4a3e81e1379d2d1b411ea8529e33d6006efdc17ee4aeac\": container with ID starting with d353dab6e4dfd0b73a4a3e81e1379d2d1b411ea8529e33d6006efdc17ee4aeac not found: ID does not exist" containerID="d353dab6e4dfd0b73a4a3e81e1379d2d1b411ea8529e33d6006efdc17ee4aeac"
Apr 16 16:51:15.827612 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:15.827516 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d353dab6e4dfd0b73a4a3e81e1379d2d1b411ea8529e33d6006efdc17ee4aeac"} err="failed to get container status \"d353dab6e4dfd0b73a4a3e81e1379d2d1b411ea8529e33d6006efdc17ee4aeac\": rpc error: code = NotFound desc = could not find container \"d353dab6e4dfd0b73a4a3e81e1379d2d1b411ea8529e33d6006efdc17ee4aeac\": container with ID starting with d353dab6e4dfd0b73a4a3e81e1379d2d1b411ea8529e33d6006efdc17ee4aeac not found: ID does not exist"
Apr 16 16:51:15.827612 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:15.827538 2577 scope.go:117] "RemoveContainer" containerID="b762b0a37d2051a898155744e1cd158a8e88494bc75b6b201fafe6f9ffba3b25"
Apr 16 16:51:15.827859 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:51:15.827838 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b762b0a37d2051a898155744e1cd158a8e88494bc75b6b201fafe6f9ffba3b25\": container with ID starting with b762b0a37d2051a898155744e1cd158a8e88494bc75b6b201fafe6f9ffba3b25 not found: ID does not exist" containerID="b762b0a37d2051a898155744e1cd158a8e88494bc75b6b201fafe6f9ffba3b25"
Apr 16 16:51:15.827904 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:15.827867 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b762b0a37d2051a898155744e1cd158a8e88494bc75b6b201fafe6f9ffba3b25"} err="failed to get container status \"b762b0a37d2051a898155744e1cd158a8e88494bc75b6b201fafe6f9ffba3b25\": rpc error: code = NotFound desc = could not find container \"b762b0a37d2051a898155744e1cd158a8e88494bc75b6b201fafe6f9ffba3b25\": container with ID starting with b762b0a37d2051a898155744e1cd158a8e88494bc75b6b201fafe6f9ffba3b25 not found: ID does not exist"
Apr 16 16:51:17.652875 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:17.652843 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db43c9ac-2064-4f08-90ca-9c9258909279" path="/var/lib/kubelet/pods/db43c9ac-2064-4f08-90ca-9c9258909279/volumes"
Apr 16 16:51:17.719268 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:17.719226 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" podUID="4ae45b28-e7cb-44f5-b1d8-5de775c5a699" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused"
Apr 16 16:51:17.743597 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:17.743557 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" podUID="fb4372dd-6137-44bb-8d34-df64760a1c07" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused"
Apr 16 16:51:27.719584 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:27.719529 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" podUID="4ae45b28-e7cb-44f5-b1d8-5de775c5a699" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused"
Apr 16 16:51:27.743950 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:27.743903 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" podUID="fb4372dd-6137-44bb-8d34-df64760a1c07" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused"
Apr 16 16:51:37.729601 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:37.729561 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9"
Apr 16 16:51:37.741489 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:37.741465 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9"
Apr 16 16:51:37.754107 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:37.754075 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq"
Apr 16 16:51:37.762625 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:51:37.762596 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq"
Apr 16 16:52:15.348705 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:15.348652 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq"]
Apr 16 16:52:15.349278 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:15.349024 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" podUID="fb4372dd-6137-44bb-8d34-df64760a1c07" containerName="main" containerID="cri-o://c184ae8cee5af4d6995e72f74094900de4715ee9cf64491ea9ee646b6402c1fb" gracePeriod=30
Apr 16 16:52:15.357696 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:15.357667 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9"]
Apr 16 16:52:15.358069 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:15.358034 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" podUID="4ae45b28-e7cb-44f5-b1d8-5de775c5a699" containerName="main" containerID="cri-o://4d691db287ad1787bab8603605eb02e8ec177fc9c200d23d43b41194e0fe5283" gracePeriod=30
Apr 16 16:52:45.358122 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:45.358057 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" podUID="4ae45b28-e7cb-44f5-b1d8-5de775c5a699" containerName="llm-d-routing-sidecar" containerID="cri-o://fb727e75ec675b69bf02c5644df5ff4e901cc1078d674c0f66354c90aff963cb" gracePeriod=2
Apr 16 16:52:45.802854 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:45.802825 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9_4ae45b28-e7cb-44f5-b1d8-5de775c5a699/main/0.log"
Apr 16 16:52:45.803547 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:45.803528 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9"
Apr 16 16:52:45.806292 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:45.806272 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq"
Apr 16 16:52:45.917054 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:45.917015 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-dshm\") pod \"4ae45b28-e7cb-44f5-b1d8-5de775c5a699\" (UID: \"4ae45b28-e7cb-44f5-b1d8-5de775c5a699\") "
Apr 16 16:52:45.917234 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:45.917066 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-model-cache\") pod \"4ae45b28-e7cb-44f5-b1d8-5de775c5a699\" (UID: \"4ae45b28-e7cb-44f5-b1d8-5de775c5a699\") "
Apr 16 16:52:45.917234 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:45.917101 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm859\" (UniqueName: \"kubernetes.io/projected/fb4372dd-6137-44bb-8d34-df64760a1c07-kube-api-access-xm859\") pod \"fb4372dd-6137-44bb-8d34-df64760a1c07\" (UID: \"fb4372dd-6137-44bb-8d34-df64760a1c07\") "
Apr 16 16:52:45.917234 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:45.917118 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g64b4\" (UniqueName: \"kubernetes.io/projected/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-kube-api-access-g64b4\") pod \"4ae45b28-e7cb-44f5-b1d8-5de775c5a699\" (UID: \"4ae45b28-e7cb-44f5-b1d8-5de775c5a699\") "
Apr 16 16:52:45.917234 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:45.917193 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-tls-certs\") pod \"4ae45b28-e7cb-44f5-b1d8-5de775c5a699\" (UID: \"4ae45b28-e7cb-44f5-b1d8-5de775c5a699\") "
Apr 16 16:52:45.917234 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:45.917219 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fb4372dd-6137-44bb-8d34-df64760a1c07-home\") pod \"fb4372dd-6137-44bb-8d34-df64760a1c07\" (UID: \"fb4372dd-6137-44bb-8d34-df64760a1c07\") "
Apr 16 16:52:45.917571 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:45.917263 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fb4372dd-6137-44bb-8d34-df64760a1c07-dshm\") pod \"fb4372dd-6137-44bb-8d34-df64760a1c07\" (UID: \"fb4372dd-6137-44bb-8d34-df64760a1c07\") "
Apr 16 16:52:45.917571 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:45.917289 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fb4372dd-6137-44bb-8d34-df64760a1c07-tls-certs\") pod \"fb4372dd-6137-44bb-8d34-df64760a1c07\" (UID: \"fb4372dd-6137-44bb-8d34-df64760a1c07\") "
Apr 16 16:52:45.917571 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:45.917334 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fb4372dd-6137-44bb-8d34-df64760a1c07-model-cache\") pod \"fb4372dd-6137-44bb-8d34-df64760a1c07\" (UID: \"fb4372dd-6137-44bb-8d34-df64760a1c07\") "
Apr 16 16:52:45.917571 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:45.917363 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb4372dd-6137-44bb-8d34-df64760a1c07-kserve-provision-location\") pod \"fb4372dd-6137-44bb-8d34-df64760a1c07\" (UID: \"fb4372dd-6137-44bb-8d34-df64760a1c07\") "
Apr 16 16:52:45.917571 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:45.917395 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-kserve-provision-location\") pod \"4ae45b28-e7cb-44f5-b1d8-5de775c5a699\" (UID: \"4ae45b28-e7cb-44f5-b1d8-5de775c5a699\") "
Apr 16 16:52:45.917571 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:45.917457 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-home\") pod \"4ae45b28-e7cb-44f5-b1d8-5de775c5a699\" (UID: \"4ae45b28-e7cb-44f5-b1d8-5de775c5a699\") "
Apr 16 16:52:45.917571 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:45.917471 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-model-cache" (OuterVolumeSpecName: "model-cache") pod "4ae45b28-e7cb-44f5-b1d8-5de775c5a699" (UID: "4ae45b28-e7cb-44f5-b1d8-5de775c5a699"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:52:45.917949 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:45.917707 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb4372dd-6137-44bb-8d34-df64760a1c07-model-cache" (OuterVolumeSpecName: "model-cache") pod "fb4372dd-6137-44bb-8d34-df64760a1c07" (UID: "fb4372dd-6137-44bb-8d34-df64760a1c07"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:52:45.917949 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:45.917742 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb4372dd-6137-44bb-8d34-df64760a1c07-home" (OuterVolumeSpecName: "home") pod "fb4372dd-6137-44bb-8d34-df64760a1c07" (UID: "fb4372dd-6137-44bb-8d34-df64760a1c07"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:52:45.917949 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:45.917899 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-home" (OuterVolumeSpecName: "home") pod "4ae45b28-e7cb-44f5-b1d8-5de775c5a699" (UID: "4ae45b28-e7cb-44f5-b1d8-5de775c5a699"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:52:45.918111 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:45.918012 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fb4372dd-6137-44bb-8d34-df64760a1c07-home\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:52:45.918111 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:45.918031 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fb4372dd-6137-44bb-8d34-df64760a1c07-model-cache\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:52:45.918111 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:45.918046 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-home\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:52:45.918111 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:45.918060 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-model-cache\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\""
Apr 16 16:52:45.919968 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:45.919903 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-dshm" (OuterVolumeSpecName: "dshm") pod "4ae45b28-e7cb-44f5-b1d8-5de775c5a699" (UID: "4ae45b28-e7cb-44f5-b1d8-5de775c5a699"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:52:45.919968 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:45.919914 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb4372dd-6137-44bb-8d34-df64760a1c07-kube-api-access-xm859" (OuterVolumeSpecName: "kube-api-access-xm859") pod "fb4372dd-6137-44bb-8d34-df64760a1c07" (UID: "fb4372dd-6137-44bb-8d34-df64760a1c07"). InnerVolumeSpecName "kube-api-access-xm859". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:52:45.919968 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:45.919902 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "4ae45b28-e7cb-44f5-b1d8-5de775c5a699" (UID: "4ae45b28-e7cb-44f5-b1d8-5de775c5a699"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:52:45.920191 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:45.920003 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-kube-api-access-g64b4" (OuterVolumeSpecName: "kube-api-access-g64b4") pod "4ae45b28-e7cb-44f5-b1d8-5de775c5a699" (UID: "4ae45b28-e7cb-44f5-b1d8-5de775c5a699"). InnerVolumeSpecName "kube-api-access-g64b4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:52:45.920191 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:45.920095 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb4372dd-6137-44bb-8d34-df64760a1c07-dshm" (OuterVolumeSpecName: "dshm") pod "fb4372dd-6137-44bb-8d34-df64760a1c07" (UID: "fb4372dd-6137-44bb-8d34-df64760a1c07"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:52:45.920191 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:45.920169 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb4372dd-6137-44bb-8d34-df64760a1c07-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "fb4372dd-6137-44bb-8d34-df64760a1c07" (UID: "fb4372dd-6137-44bb-8d34-df64760a1c07"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:52:45.988929 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:45.988829 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb4372dd-6137-44bb-8d34-df64760a1c07-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fb4372dd-6137-44bb-8d34-df64760a1c07" (UID: "fb4372dd-6137-44bb-8d34-df64760a1c07"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:52:45.989327 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:45.989294 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4ae45b28-e7cb-44f5-b1d8-5de775c5a699" (UID: "4ae45b28-e7cb-44f5-b1d8-5de775c5a699"). InnerVolumeSpecName "kserve-provision-location".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:52:46.018885 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.018851 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-dshm\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:52:46.018885 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.018883 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xm859\" (UniqueName: \"kubernetes.io/projected/fb4372dd-6137-44bb-8d34-df64760a1c07-kube-api-access-xm859\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:52:46.019027 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.018895 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g64b4\" (UniqueName: \"kubernetes.io/projected/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-kube-api-access-g64b4\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:52:46.019027 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.018905 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-tls-certs\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:52:46.019027 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.018914 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fb4372dd-6137-44bb-8d34-df64760a1c07-dshm\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:52:46.019027 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.018922 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fb4372dd-6137-44bb-8d34-df64760a1c07-tls-certs\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:52:46.019027 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.018930 2577 
reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb4372dd-6137-44bb-8d34-df64760a1c07-kserve-provision-location\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:52:46.019027 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.018939 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ae45b28-e7cb-44f5-b1d8-5de775c5a699-kserve-provision-location\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:52:46.076529 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.076490 2577 generic.go:358] "Generic (PLEG): container finished" podID="fb4372dd-6137-44bb-8d34-df64760a1c07" containerID="c184ae8cee5af4d6995e72f74094900de4715ee9cf64491ea9ee646b6402c1fb" exitCode=137 Apr 16 16:52:46.076739 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.076589 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" Apr 16 16:52:46.076739 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.076586 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" event={"ID":"fb4372dd-6137-44bb-8d34-df64760a1c07","Type":"ContainerDied","Data":"c184ae8cee5af4d6995e72f74094900de4715ee9cf64491ea9ee646b6402c1fb"} Apr 16 16:52:46.076739 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.076628 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq" event={"ID":"fb4372dd-6137-44bb-8d34-df64760a1c07","Type":"ContainerDied","Data":"e3480f808ad8e44f8c413478727b7f6fb342bcbd21d711fd7cb01f39bf8fe8e8"} Apr 16 16:52:46.076739 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.076645 2577 scope.go:117] "RemoveContainer" 
containerID="c184ae8cee5af4d6995e72f74094900de4715ee9cf64491ea9ee646b6402c1fb" Apr 16 16:52:46.077944 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.077925 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9_4ae45b28-e7cb-44f5-b1d8-5de775c5a699/main/0.log" Apr 16 16:52:46.078665 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.078636 2577 generic.go:358] "Generic (PLEG): container finished" podID="4ae45b28-e7cb-44f5-b1d8-5de775c5a699" containerID="4d691db287ad1787bab8603605eb02e8ec177fc9c200d23d43b41194e0fe5283" exitCode=137 Apr 16 16:52:46.078665 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.078662 2577 generic.go:358] "Generic (PLEG): container finished" podID="4ae45b28-e7cb-44f5-b1d8-5de775c5a699" containerID="fb727e75ec675b69bf02c5644df5ff4e901cc1078d674c0f66354c90aff963cb" exitCode=0 Apr 16 16:52:46.078822 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.078673 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" event={"ID":"4ae45b28-e7cb-44f5-b1d8-5de775c5a699","Type":"ContainerDied","Data":"4d691db287ad1787bab8603605eb02e8ec177fc9c200d23d43b41194e0fe5283"} Apr 16 16:52:46.078822 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.078711 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" event={"ID":"4ae45b28-e7cb-44f5-b1d8-5de775c5a699","Type":"ContainerDied","Data":"fb727e75ec675b69bf02c5644df5ff4e901cc1078d674c0f66354c90aff963cb"} Apr 16 16:52:46.078822 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.078727 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" event={"ID":"4ae45b28-e7cb-44f5-b1d8-5de775c5a699","Type":"ContainerDied","Data":"8ec90da9d523f9107da41505cf9e61362aa8583785e57749d08da145f35c773a"} Apr 16 
16:52:46.078822 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.078756 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9" Apr 16 16:52:46.099126 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.099105 2577 scope.go:117] "RemoveContainer" containerID="c0516bfd9209775710d10e61fdac1a022dad845de439c78f076da629891cbb16" Apr 16 16:52:46.105164 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.105138 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq"] Apr 16 16:52:46.110516 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.110479 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-55c8678f5c-k4nbq"] Apr 16 16:52:46.111401 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.111386 2577 scope.go:117] "RemoveContainer" containerID="c184ae8cee5af4d6995e72f74094900de4715ee9cf64491ea9ee646b6402c1fb" Apr 16 16:52:46.111785 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:52:46.111747 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c184ae8cee5af4d6995e72f74094900de4715ee9cf64491ea9ee646b6402c1fb\": container with ID starting with c184ae8cee5af4d6995e72f74094900de4715ee9cf64491ea9ee646b6402c1fb not found: ID does not exist" containerID="c184ae8cee5af4d6995e72f74094900de4715ee9cf64491ea9ee646b6402c1fb" Apr 16 16:52:46.111870 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.111796 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c184ae8cee5af4d6995e72f74094900de4715ee9cf64491ea9ee646b6402c1fb"} err="failed to get container status \"c184ae8cee5af4d6995e72f74094900de4715ee9cf64491ea9ee646b6402c1fb\": rpc error: code = NotFound desc = could not find container 
\"c184ae8cee5af4d6995e72f74094900de4715ee9cf64491ea9ee646b6402c1fb\": container with ID starting with c184ae8cee5af4d6995e72f74094900de4715ee9cf64491ea9ee646b6402c1fb not found: ID does not exist" Apr 16 16:52:46.111870 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.111819 2577 scope.go:117] "RemoveContainer" containerID="c0516bfd9209775710d10e61fdac1a022dad845de439c78f076da629891cbb16" Apr 16 16:52:46.112074 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:52:46.112051 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0516bfd9209775710d10e61fdac1a022dad845de439c78f076da629891cbb16\": container with ID starting with c0516bfd9209775710d10e61fdac1a022dad845de439c78f076da629891cbb16 not found: ID does not exist" containerID="c0516bfd9209775710d10e61fdac1a022dad845de439c78f076da629891cbb16" Apr 16 16:52:46.112128 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.112080 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0516bfd9209775710d10e61fdac1a022dad845de439c78f076da629891cbb16"} err="failed to get container status \"c0516bfd9209775710d10e61fdac1a022dad845de439c78f076da629891cbb16\": rpc error: code = NotFound desc = could not find container \"c0516bfd9209775710d10e61fdac1a022dad845de439c78f076da629891cbb16\": container with ID starting with c0516bfd9209775710d10e61fdac1a022dad845de439c78f076da629891cbb16 not found: ID does not exist" Apr 16 16:52:46.112128 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.112095 2577 scope.go:117] "RemoveContainer" containerID="4d691db287ad1787bab8603605eb02e8ec177fc9c200d23d43b41194e0fe5283" Apr 16 16:52:46.121192 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.120837 2577 scope.go:117] "RemoveContainer" containerID="c58f03b91e4c42c027d320e05c447e4e56b8379d88eaf72a66fd982a32769efb" Apr 16 16:52:46.124558 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.124533 2577 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9"] Apr 16 16:52:46.127292 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.127269 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-58fbddc65d-2b7r9"] Apr 16 16:52:46.134306 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.134284 2577 scope.go:117] "RemoveContainer" containerID="fb727e75ec675b69bf02c5644df5ff4e901cc1078d674c0f66354c90aff963cb" Apr 16 16:52:46.143810 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.143787 2577 scope.go:117] "RemoveContainer" containerID="4d691db287ad1787bab8603605eb02e8ec177fc9c200d23d43b41194e0fe5283" Apr 16 16:52:46.144132 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:52:46.144103 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d691db287ad1787bab8603605eb02e8ec177fc9c200d23d43b41194e0fe5283\": container with ID starting with 4d691db287ad1787bab8603605eb02e8ec177fc9c200d23d43b41194e0fe5283 not found: ID does not exist" containerID="4d691db287ad1787bab8603605eb02e8ec177fc9c200d23d43b41194e0fe5283" Apr 16 16:52:46.144194 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.144138 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d691db287ad1787bab8603605eb02e8ec177fc9c200d23d43b41194e0fe5283"} err="failed to get container status \"4d691db287ad1787bab8603605eb02e8ec177fc9c200d23d43b41194e0fe5283\": rpc error: code = NotFound desc = could not find container \"4d691db287ad1787bab8603605eb02e8ec177fc9c200d23d43b41194e0fe5283\": container with ID starting with 4d691db287ad1787bab8603605eb02e8ec177fc9c200d23d43b41194e0fe5283 not found: ID does not exist" Apr 16 16:52:46.144194 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.144160 2577 scope.go:117] "RemoveContainer" 
containerID="c58f03b91e4c42c027d320e05c447e4e56b8379d88eaf72a66fd982a32769efb" Apr 16 16:52:46.144453 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:52:46.144422 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c58f03b91e4c42c027d320e05c447e4e56b8379d88eaf72a66fd982a32769efb\": container with ID starting with c58f03b91e4c42c027d320e05c447e4e56b8379d88eaf72a66fd982a32769efb not found: ID does not exist" containerID="c58f03b91e4c42c027d320e05c447e4e56b8379d88eaf72a66fd982a32769efb" Apr 16 16:52:46.144496 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.144460 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c58f03b91e4c42c027d320e05c447e4e56b8379d88eaf72a66fd982a32769efb"} err="failed to get container status \"c58f03b91e4c42c027d320e05c447e4e56b8379d88eaf72a66fd982a32769efb\": rpc error: code = NotFound desc = could not find container \"c58f03b91e4c42c027d320e05c447e4e56b8379d88eaf72a66fd982a32769efb\": container with ID starting with c58f03b91e4c42c027d320e05c447e4e56b8379d88eaf72a66fd982a32769efb not found: ID does not exist" Apr 16 16:52:46.144496 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.144483 2577 scope.go:117] "RemoveContainer" containerID="fb727e75ec675b69bf02c5644df5ff4e901cc1078d674c0f66354c90aff963cb" Apr 16 16:52:46.144707 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:52:46.144686 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb727e75ec675b69bf02c5644df5ff4e901cc1078d674c0f66354c90aff963cb\": container with ID starting with fb727e75ec675b69bf02c5644df5ff4e901cc1078d674c0f66354c90aff963cb not found: ID does not exist" containerID="fb727e75ec675b69bf02c5644df5ff4e901cc1078d674c0f66354c90aff963cb" Apr 16 16:52:46.144752 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.144712 2577 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"fb727e75ec675b69bf02c5644df5ff4e901cc1078d674c0f66354c90aff963cb"} err="failed to get container status \"fb727e75ec675b69bf02c5644df5ff4e901cc1078d674c0f66354c90aff963cb\": rpc error: code = NotFound desc = could not find container \"fb727e75ec675b69bf02c5644df5ff4e901cc1078d674c0f66354c90aff963cb\": container with ID starting with fb727e75ec675b69bf02c5644df5ff4e901cc1078d674c0f66354c90aff963cb not found: ID does not exist" Apr 16 16:52:46.144752 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.144725 2577 scope.go:117] "RemoveContainer" containerID="4d691db287ad1787bab8603605eb02e8ec177fc9c200d23d43b41194e0fe5283" Apr 16 16:52:46.144951 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.144935 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d691db287ad1787bab8603605eb02e8ec177fc9c200d23d43b41194e0fe5283"} err="failed to get container status \"4d691db287ad1787bab8603605eb02e8ec177fc9c200d23d43b41194e0fe5283\": rpc error: code = NotFound desc = could not find container \"4d691db287ad1787bab8603605eb02e8ec177fc9c200d23d43b41194e0fe5283\": container with ID starting with 4d691db287ad1787bab8603605eb02e8ec177fc9c200d23d43b41194e0fe5283 not found: ID does not exist" Apr 16 16:52:46.144951 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.144951 2577 scope.go:117] "RemoveContainer" containerID="c58f03b91e4c42c027d320e05c447e4e56b8379d88eaf72a66fd982a32769efb" Apr 16 16:52:46.145149 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.145130 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c58f03b91e4c42c027d320e05c447e4e56b8379d88eaf72a66fd982a32769efb"} err="failed to get container status \"c58f03b91e4c42c027d320e05c447e4e56b8379d88eaf72a66fd982a32769efb\": rpc error: code = NotFound desc = could not find container \"c58f03b91e4c42c027d320e05c447e4e56b8379d88eaf72a66fd982a32769efb\": container with ID starting with 
c58f03b91e4c42c027d320e05c447e4e56b8379d88eaf72a66fd982a32769efb not found: ID does not exist" Apr 16 16:52:46.145193 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.145149 2577 scope.go:117] "RemoveContainer" containerID="fb727e75ec675b69bf02c5644df5ff4e901cc1078d674c0f66354c90aff963cb" Apr 16 16:52:46.145364 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:46.145345 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb727e75ec675b69bf02c5644df5ff4e901cc1078d674c0f66354c90aff963cb"} err="failed to get container status \"fb727e75ec675b69bf02c5644df5ff4e901cc1078d674c0f66354c90aff963cb\": rpc error: code = NotFound desc = could not find container \"fb727e75ec675b69bf02c5644df5ff4e901cc1078d674c0f66354c90aff963cb\": container with ID starting with fb727e75ec675b69bf02c5644df5ff4e901cc1078d674c0f66354c90aff963cb not found: ID does not exist" Apr 16 16:52:47.651325 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:47.651287 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ae45b28-e7cb-44f5-b1d8-5de775c5a699" path="/var/lib/kubelet/pods/4ae45b28-e7cb-44f5-b1d8-5de775c5a699/volumes" Apr 16 16:52:47.651805 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:47.651790 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb4372dd-6137-44bb-8d34-df64760a1c07" path="/var/lib/kubelet/pods/fb4372dd-6137-44bb-8d34-df64760a1c07/volumes" Apr 16 16:52:51.669562 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:51.669528 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hschh_652350aa-d2fc-4c32-bc1b-e593db927908/ovn-acl-logging/0.log" Apr 16 16:52:51.671376 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:52:51.671353 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hschh_652350aa-d2fc-4c32-bc1b-e593db927908/ovn-acl-logging/0.log" Apr 16 16:54:57.596896 ip-10-0-130-165 
kubenswrapper[2577]: I0416 16:54:57.596856 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zvtqj/must-gather-48kgv"] Apr 16 16:54:57.597429 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597366 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b56fe14-94af-4d7e-8541-7468f7349e1e" containerName="storage-initializer" Apr 16 16:54:57.597429 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597386 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b56fe14-94af-4d7e-8541-7468f7349e1e" containerName="storage-initializer" Apr 16 16:54:57.597429 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597401 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b56fe14-94af-4d7e-8541-7468f7349e1e" containerName="main" Apr 16 16:54:57.597429 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597411 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b56fe14-94af-4d7e-8541-7468f7349e1e" containerName="main" Apr 16 16:54:57.597429 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597421 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db43c9ac-2064-4f08-90ca-9c9258909279" containerName="main" Apr 16 16:54:57.597429 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597429 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="db43c9ac-2064-4f08-90ca-9c9258909279" containerName="main" Apr 16 16:54:57.597846 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597458 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="239f5ac8-ab81-4824-9309-d7950f9dc58f" containerName="storage-initializer" Apr 16 16:54:57.597846 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597467 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="239f5ac8-ab81-4824-9309-d7950f9dc58f" containerName="storage-initializer" Apr 16 16:54:57.597846 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597484 
2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5485249-290d-42c8-b274-6111f1454a7f" containerName="main" Apr 16 16:54:57.597846 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597493 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5485249-290d-42c8-b274-6111f1454a7f" containerName="main" Apr 16 16:54:57.597846 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597503 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ae45b28-e7cb-44f5-b1d8-5de775c5a699" containerName="storage-initializer" Apr 16 16:54:57.597846 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597511 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae45b28-e7cb-44f5-b1d8-5de775c5a699" containerName="storage-initializer" Apr 16 16:54:57.597846 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597523 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="239f5ac8-ab81-4824-9309-d7950f9dc58f" containerName="main" Apr 16 16:54:57.597846 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597531 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="239f5ac8-ab81-4824-9309-d7950f9dc58f" containerName="main" Apr 16 16:54:57.597846 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597542 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ae45b28-e7cb-44f5-b1d8-5de775c5a699" containerName="main" Apr 16 16:54:57.597846 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597550 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae45b28-e7cb-44f5-b1d8-5de775c5a699" containerName="main" Apr 16 16:54:57.597846 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597559 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb4372dd-6137-44bb-8d34-df64760a1c07" containerName="main" Apr 16 16:54:57.597846 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597567 2577 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="fb4372dd-6137-44bb-8d34-df64760a1c07" containerName="main" Apr 16 16:54:57.597846 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597582 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="239f5ac8-ab81-4824-9309-d7950f9dc58f" containerName="llm-d-routing-sidecar" Apr 16 16:54:57.597846 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597590 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="239f5ac8-ab81-4824-9309-d7950f9dc58f" containerName="llm-d-routing-sidecar" Apr 16 16:54:57.597846 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597604 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ae45b28-e7cb-44f5-b1d8-5de775c5a699" containerName="llm-d-routing-sidecar" Apr 16 16:54:57.597846 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597612 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae45b28-e7cb-44f5-b1d8-5de775c5a699" containerName="llm-d-routing-sidecar" Apr 16 16:54:57.597846 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597625 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5485249-290d-42c8-b274-6111f1454a7f" containerName="storage-initializer" Apr 16 16:54:57.597846 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597633 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5485249-290d-42c8-b274-6111f1454a7f" containerName="storage-initializer" Apr 16 16:54:57.597846 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597642 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db43c9ac-2064-4f08-90ca-9c9258909279" containerName="storage-initializer" Apr 16 16:54:57.597846 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597651 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="db43c9ac-2064-4f08-90ca-9c9258909279" containerName="storage-initializer" Apr 16 16:54:57.597846 ip-10-0-130-165 kubenswrapper[2577]: I0416 
16:54:57.597659 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb4372dd-6137-44bb-8d34-df64760a1c07" containerName="storage-initializer" Apr 16 16:54:57.597846 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597668 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb4372dd-6137-44bb-8d34-df64760a1c07" containerName="storage-initializer" Apr 16 16:54:57.597846 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597758 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ae45b28-e7cb-44f5-b1d8-5de775c5a699" containerName="llm-d-routing-sidecar" Apr 16 16:54:57.597846 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597772 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5485249-290d-42c8-b274-6111f1454a7f" containerName="main" Apr 16 16:54:57.597846 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597785 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="db43c9ac-2064-4f08-90ca-9c9258909279" containerName="main" Apr 16 16:54:57.597846 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597798 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb4372dd-6137-44bb-8d34-df64760a1c07" containerName="main" Apr 16 16:54:57.597846 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597809 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="239f5ac8-ab81-4824-9309-d7950f9dc58f" containerName="llm-d-routing-sidecar" Apr 16 16:54:57.597846 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597820 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="239f5ac8-ab81-4824-9309-d7950f9dc58f" containerName="main" Apr 16 16:54:57.597846 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597829 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ae45b28-e7cb-44f5-b1d8-5de775c5a699" containerName="main" Apr 16 16:54:57.597846 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.597839 2577 
memory_manager.go:356] "RemoveStaleState removing state" podUID="1b56fe14-94af-4d7e-8541-7468f7349e1e" containerName="main" Apr 16 16:54:57.601158 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.601137 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zvtqj/must-gather-48kgv" Apr 16 16:54:57.603746 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.603721 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zvtqj\"/\"kube-root-ca.crt\"" Apr 16 16:54:57.603870 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.603721 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-zvtqj\"/\"default-dockercfg-sbfg8\"" Apr 16 16:54:57.603870 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.603748 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zvtqj\"/\"openshift-service-ca.crt\"" Apr 16 16:54:57.608438 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.608108 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zvtqj/must-gather-48kgv"] Apr 16 16:54:57.712932 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.712897 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4a3d3efe-c973-4ce9-9cd0-fdcca3587975-must-gather-output\") pod \"must-gather-48kgv\" (UID: \"4a3d3efe-c973-4ce9-9cd0-fdcca3587975\") " pod="openshift-must-gather-zvtqj/must-gather-48kgv" Apr 16 16:54:57.713129 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.713016 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t9fc\" (UniqueName: \"kubernetes.io/projected/4a3d3efe-c973-4ce9-9cd0-fdcca3587975-kube-api-access-5t9fc\") pod \"must-gather-48kgv\" (UID: 
\"4a3d3efe-c973-4ce9-9cd0-fdcca3587975\") " pod="openshift-must-gather-zvtqj/must-gather-48kgv" Apr 16 16:54:57.813852 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.813818 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4a3d3efe-c973-4ce9-9cd0-fdcca3587975-must-gather-output\") pod \"must-gather-48kgv\" (UID: \"4a3d3efe-c973-4ce9-9cd0-fdcca3587975\") " pod="openshift-must-gather-zvtqj/must-gather-48kgv" Apr 16 16:54:57.813852 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.813859 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5t9fc\" (UniqueName: \"kubernetes.io/projected/4a3d3efe-c973-4ce9-9cd0-fdcca3587975-kube-api-access-5t9fc\") pod \"must-gather-48kgv\" (UID: \"4a3d3efe-c973-4ce9-9cd0-fdcca3587975\") " pod="openshift-must-gather-zvtqj/must-gather-48kgv" Apr 16 16:54:57.814207 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.814184 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4a3d3efe-c973-4ce9-9cd0-fdcca3587975-must-gather-output\") pod \"must-gather-48kgv\" (UID: \"4a3d3efe-c973-4ce9-9cd0-fdcca3587975\") " pod="openshift-must-gather-zvtqj/must-gather-48kgv" Apr 16 16:54:57.822646 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.822620 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t9fc\" (UniqueName: \"kubernetes.io/projected/4a3d3efe-c973-4ce9-9cd0-fdcca3587975-kube-api-access-5t9fc\") pod \"must-gather-48kgv\" (UID: \"4a3d3efe-c973-4ce9-9cd0-fdcca3587975\") " pod="openshift-must-gather-zvtqj/must-gather-48kgv" Apr 16 16:54:57.910566 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:57.910532 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zvtqj/must-gather-48kgv" Apr 16 16:54:58.034716 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:58.034681 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zvtqj/must-gather-48kgv"] Apr 16 16:54:58.037260 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:54:58.037230 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a3d3efe_c973_4ce9_9cd0_fdcca3587975.slice/crio-1c412c1b7c379b0056b6cbd44e5482de9bdac5d7e08bb863923b49077e190f3c WatchSource:0}: Error finding container 1c412c1b7c379b0056b6cbd44e5482de9bdac5d7e08bb863923b49077e190f3c: Status 404 returned error can't find the container with id 1c412c1b7c379b0056b6cbd44e5482de9bdac5d7e08bb863923b49077e190f3c Apr 16 16:54:58.038986 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:58.038969 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:54:58.530092 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:54:58.530057 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zvtqj/must-gather-48kgv" event={"ID":"4a3d3efe-c973-4ce9-9cd0-fdcca3587975","Type":"ContainerStarted","Data":"1c412c1b7c379b0056b6cbd44e5482de9bdac5d7e08bb863923b49077e190f3c"} Apr 16 16:55:02.548070 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:02.548021 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zvtqj/must-gather-48kgv" event={"ID":"4a3d3efe-c973-4ce9-9cd0-fdcca3587975","Type":"ContainerStarted","Data":"29287a33e0d44befa221846fa2b98ac0fe959059d4d3300f85c22c95fe1d897c"} Apr 16 16:55:02.548499 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:02.548079 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zvtqj/must-gather-48kgv" 
event={"ID":"4a3d3efe-c973-4ce9-9cd0-fdcca3587975","Type":"ContainerStarted","Data":"3b46a83b7f1d31668624d1a3a343ae48de3663fe0b23d9e8bb756a1c401b2d33"} Apr 16 16:55:02.565549 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:02.565500 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zvtqj/must-gather-48kgv" podStartSLOduration=1.5025385930000001 podStartE2EDuration="5.565487033s" podCreationTimestamp="2026-04-16 16:54:57 +0000 UTC" firstStartedPulling="2026-04-16 16:54:58.039092233 +0000 UTC m=+1927.003872707" lastFinishedPulling="2026-04-16 16:55:02.102040657 +0000 UTC m=+1931.066821147" observedRunningTime="2026-04-16 16:55:02.563992595 +0000 UTC m=+1931.528773092" watchObservedRunningTime="2026-04-16 16:55:02.565487033 +0000 UTC m=+1931.530267529" Apr 16 16:55:12.008045 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:12.008016 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-vfv42_aab3a47b-ea57-4dc4-ba01-6801db77b1e4/istio-proxy/0.log" Apr 16 16:55:13.106520 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:13.106486 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-vfv42_aab3a47b-ea57-4dc4-ba01-6801db77b1e4/istio-proxy/0.log" Apr 16 16:55:14.235885 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:14.235852 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-vfv42_aab3a47b-ea57-4dc4-ba01-6801db77b1e4/istio-proxy/0.log" Apr 16 16:55:15.249723 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:15.249689 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-vfv42_aab3a47b-ea57-4dc4-ba01-6801db77b1e4/istio-proxy/0.log" Apr 16 16:55:16.290621 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:16.290592 2577 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-vfv42_aab3a47b-ea57-4dc4-ba01-6801db77b1e4/istio-proxy/0.log" Apr 16 16:55:17.321345 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:17.321313 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-vfv42_aab3a47b-ea57-4dc4-ba01-6801db77b1e4/istio-proxy/0.log" Apr 16 16:55:18.363894 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:18.363864 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-vfv42_aab3a47b-ea57-4dc4-ba01-6801db77b1e4/istio-proxy/0.log" Apr 16 16:55:19.396181 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:19.396141 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-vfv42_aab3a47b-ea57-4dc4-ba01-6801db77b1e4/istio-proxy/0.log" Apr 16 16:55:20.637739 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:20.637704 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-vfv42_aab3a47b-ea57-4dc4-ba01-6801db77b1e4/istio-proxy/0.log" Apr 16 16:55:21.674531 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:21.674495 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-vfv42_aab3a47b-ea57-4dc4-ba01-6801db77b1e4/istio-proxy/0.log" Apr 16 16:55:22.705606 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:22.705576 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-vfv42_aab3a47b-ea57-4dc4-ba01-6801db77b1e4/istio-proxy/0.log" Apr 16 16:55:23.741104 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:23.741073 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-vfv42_aab3a47b-ea57-4dc4-ba01-6801db77b1e4/istio-proxy/0.log" Apr 16 16:55:24.762590 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:24.762563 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-vfv42_aab3a47b-ea57-4dc4-ba01-6801db77b1e4/istio-proxy/0.log" Apr 16 16:55:25.774368 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:25.774331 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-vfv42_aab3a47b-ea57-4dc4-ba01-6801db77b1e4/istio-proxy/0.log" Apr 16 16:55:26.896294 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:26.896263 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-fd2h8_981a5a65-c6e6-43dd-828e-1d3b5a580b24/istio-proxy/0.log" Apr 16 16:55:27.759956 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:27.759931 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-fd2h8_981a5a65-c6e6-43dd-828e-1d3b5a580b24/istio-proxy/0.log" Apr 16 16:55:28.595501 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:28.595472 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-wj5hp_afa14995-cc29-4885-8eda-eea6e807b984/manager/0.log" Apr 16 16:55:28.608153 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:28.608121 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-g7ll6_3efab7bb-d0d9-4b54-aec8-ae9d9d4408a5/manager/0.log" Apr 16 16:55:28.682361 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:28.682328 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-j55fq_7ba5ebd8-517f-4adb-9ee2-e934b8ef4864/limitador/0.log" Apr 16 16:55:29.652133 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:29.652046 2577 generic.go:358] "Generic (PLEG): container finished" podID="4a3d3efe-c973-4ce9-9cd0-fdcca3587975" containerID="3b46a83b7f1d31668624d1a3a343ae48de3663fe0b23d9e8bb756a1c401b2d33" exitCode=0 Apr 16 16:55:29.652133 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:29.652113 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zvtqj/must-gather-48kgv" event={"ID":"4a3d3efe-c973-4ce9-9cd0-fdcca3587975","Type":"ContainerDied","Data":"3b46a83b7f1d31668624d1a3a343ae48de3663fe0b23d9e8bb756a1c401b2d33"} Apr 16 16:55:29.652589 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:29.652434 2577 scope.go:117] "RemoveContainer" containerID="3b46a83b7f1d31668624d1a3a343ae48de3663fe0b23d9e8bb756a1c401b2d33" Apr 16 16:55:30.368904 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:30.368880 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zvtqj_must-gather-48kgv_4a3d3efe-c973-4ce9-9cd0-fdcca3587975/gather/0.log" Apr 16 16:55:34.044635 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:34.044596 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-zh4s9_23450ee2-cf03-4966-b11a-bec44507a72d/global-pull-secret-syncer/0.log" Apr 16 16:55:34.181084 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:34.181051 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-rd84q_753dfb74-b65d-4c0b-b6d1-a0907d0024bc/konnectivity-agent/0.log" Apr 16 16:55:34.206133 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:34.206105 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-165.ec2.internal_cfff45f070cd3f24f31d63385bd46a42/haproxy/0.log" Apr 16 16:55:35.875279 ip-10-0-130-165 
kubenswrapper[2577]: I0416 16:55:35.875245 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zvtqj/must-gather-48kgv"] Apr 16 16:55:35.875708 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:35.875478 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-zvtqj/must-gather-48kgv" podUID="4a3d3efe-c973-4ce9-9cd0-fdcca3587975" containerName="copy" containerID="cri-o://29287a33e0d44befa221846fa2b98ac0fe959059d4d3300f85c22c95fe1d897c" gracePeriod=2 Apr 16 16:55:35.881798 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:35.881775 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zvtqj/must-gather-48kgv"] Apr 16 16:55:36.105397 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:36.105373 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zvtqj_must-gather-48kgv_4a3d3efe-c973-4ce9-9cd0-fdcca3587975/copy/0.log" Apr 16 16:55:36.105766 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:36.105749 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zvtqj/must-gather-48kgv" Apr 16 16:55:36.107800 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:36.107779 2577 status_manager.go:895] "Failed to get status for pod" podUID="4a3d3efe-c973-4ce9-9cd0-fdcca3587975" pod="openshift-must-gather-zvtqj/must-gather-48kgv" err="pods \"must-gather-48kgv\" is forbidden: User \"system:node:ip-10-0-130-165.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-zvtqj\": no relationship found between node 'ip-10-0-130-165.ec2.internal' and this object" Apr 16 16:55:36.167754 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:36.167727 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4a3d3efe-c973-4ce9-9cd0-fdcca3587975-must-gather-output\") pod \"4a3d3efe-c973-4ce9-9cd0-fdcca3587975\" (UID: \"4a3d3efe-c973-4ce9-9cd0-fdcca3587975\") " Apr 16 16:55:36.167860 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:36.167817 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t9fc\" (UniqueName: \"kubernetes.io/projected/4a3d3efe-c973-4ce9-9cd0-fdcca3587975-kube-api-access-5t9fc\") pod \"4a3d3efe-c973-4ce9-9cd0-fdcca3587975\" (UID: \"4a3d3efe-c973-4ce9-9cd0-fdcca3587975\") " Apr 16 16:55:36.170011 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:36.169989 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a3d3efe-c973-4ce9-9cd0-fdcca3587975-kube-api-access-5t9fc" (OuterVolumeSpecName: "kube-api-access-5t9fc") pod "4a3d3efe-c973-4ce9-9cd0-fdcca3587975" (UID: "4a3d3efe-c973-4ce9-9cd0-fdcca3587975"). InnerVolumeSpecName "kube-api-access-5t9fc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:55:36.175954 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:36.175926 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a3d3efe-c973-4ce9-9cd0-fdcca3587975-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "4a3d3efe-c973-4ce9-9cd0-fdcca3587975" (UID: "4a3d3efe-c973-4ce9-9cd0-fdcca3587975"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:55:36.268911 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:36.268876 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5t9fc\" (UniqueName: \"kubernetes.io/projected/4a3d3efe-c973-4ce9-9cd0-fdcca3587975-kube-api-access-5t9fc\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:55:36.268911 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:36.268907 2577 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4a3d3efe-c973-4ce9-9cd0-fdcca3587975-must-gather-output\") on node \"ip-10-0-130-165.ec2.internal\" DevicePath \"\"" Apr 16 16:55:36.677218 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:36.677192 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zvtqj_must-gather-48kgv_4a3d3efe-c973-4ce9-9cd0-fdcca3587975/copy/0.log" Apr 16 16:55:36.677566 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:36.677540 2577 generic.go:358] "Generic (PLEG): container finished" podID="4a3d3efe-c973-4ce9-9cd0-fdcca3587975" containerID="29287a33e0d44befa221846fa2b98ac0fe959059d4d3300f85c22c95fe1d897c" exitCode=143 Apr 16 16:55:36.677654 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:36.677598 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zvtqj/must-gather-48kgv" Apr 16 16:55:36.677654 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:36.677622 2577 scope.go:117] "RemoveContainer" containerID="29287a33e0d44befa221846fa2b98ac0fe959059d4d3300f85c22c95fe1d897c" Apr 16 16:55:36.679827 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:36.679794 2577 status_manager.go:895] "Failed to get status for pod" podUID="4a3d3efe-c973-4ce9-9cd0-fdcca3587975" pod="openshift-must-gather-zvtqj/must-gather-48kgv" err="pods \"must-gather-48kgv\" is forbidden: User \"system:node:ip-10-0-130-165.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-zvtqj\": no relationship found between node 'ip-10-0-130-165.ec2.internal' and this object" Apr 16 16:55:36.686384 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:36.686217 2577 scope.go:117] "RemoveContainer" containerID="3b46a83b7f1d31668624d1a3a343ae48de3663fe0b23d9e8bb756a1c401b2d33" Apr 16 16:55:36.688626 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:36.688596 2577 status_manager.go:895] "Failed to get status for pod" podUID="4a3d3efe-c973-4ce9-9cd0-fdcca3587975" pod="openshift-must-gather-zvtqj/must-gather-48kgv" err="pods \"must-gather-48kgv\" is forbidden: User \"system:node:ip-10-0-130-165.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-zvtqj\": no relationship found between node 'ip-10-0-130-165.ec2.internal' and this object" Apr 16 16:55:36.699296 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:36.699280 2577 scope.go:117] "RemoveContainer" containerID="29287a33e0d44befa221846fa2b98ac0fe959059d4d3300f85c22c95fe1d897c" Apr 16 16:55:36.699625 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:55:36.699603 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29287a33e0d44befa221846fa2b98ac0fe959059d4d3300f85c22c95fe1d897c\": container with ID 
starting with 29287a33e0d44befa221846fa2b98ac0fe959059d4d3300f85c22c95fe1d897c not found: ID does not exist" containerID="29287a33e0d44befa221846fa2b98ac0fe959059d4d3300f85c22c95fe1d897c" Apr 16 16:55:36.699712 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:36.699633 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29287a33e0d44befa221846fa2b98ac0fe959059d4d3300f85c22c95fe1d897c"} err="failed to get container status \"29287a33e0d44befa221846fa2b98ac0fe959059d4d3300f85c22c95fe1d897c\": rpc error: code = NotFound desc = could not find container \"29287a33e0d44befa221846fa2b98ac0fe959059d4d3300f85c22c95fe1d897c\": container with ID starting with 29287a33e0d44befa221846fa2b98ac0fe959059d4d3300f85c22c95fe1d897c not found: ID does not exist" Apr 16 16:55:36.699712 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:36.699658 2577 scope.go:117] "RemoveContainer" containerID="3b46a83b7f1d31668624d1a3a343ae48de3663fe0b23d9e8bb756a1c401b2d33" Apr 16 16:55:36.699927 ip-10-0-130-165 kubenswrapper[2577]: E0416 16:55:36.699911 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b46a83b7f1d31668624d1a3a343ae48de3663fe0b23d9e8bb756a1c401b2d33\": container with ID starting with 3b46a83b7f1d31668624d1a3a343ae48de3663fe0b23d9e8bb756a1c401b2d33 not found: ID does not exist" containerID="3b46a83b7f1d31668624d1a3a343ae48de3663fe0b23d9e8bb756a1c401b2d33" Apr 16 16:55:36.699965 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:36.699936 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b46a83b7f1d31668624d1a3a343ae48de3663fe0b23d9e8bb756a1c401b2d33"} err="failed to get container status \"3b46a83b7f1d31668624d1a3a343ae48de3663fe0b23d9e8bb756a1c401b2d33\": rpc error: code = NotFound desc = could not find container \"3b46a83b7f1d31668624d1a3a343ae48de3663fe0b23d9e8bb756a1c401b2d33\": container with ID starting with 
3b46a83b7f1d31668624d1a3a343ae48de3663fe0b23d9e8bb756a1c401b2d33 not found: ID does not exist" Apr 16 16:55:37.651083 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:37.651052 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a3d3efe-c973-4ce9-9cd0-fdcca3587975" path="/var/lib/kubelet/pods/4a3d3efe-c973-4ce9-9cd0-fdcca3587975/volumes" Apr 16 16:55:37.960349 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:37.960322 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-wj5hp_afa14995-cc29-4885-8eda-eea6e807b984/manager/0.log" Apr 16 16:55:37.988498 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:37.988477 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-g7ll6_3efab7bb-d0d9-4b54-aec8-ae9d9d4408a5/manager/0.log" Apr 16 16:55:38.105117 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:38.105094 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-j55fq_7ba5ebd8-517f-4adb-9ee2-e934b8ef4864/limitador/0.log" Apr 16 16:55:39.515978 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:39.515865 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-sx2kr_c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb/kube-state-metrics/0.log" Apr 16 16:55:39.541267 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:39.541228 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-sx2kr_c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb/kube-rbac-proxy-main/0.log" Apr 16 16:55:39.569165 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:39.569119 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-sx2kr_c63e3e8b-c729-4ce7-af91-e9f6ee85dbdb/kube-rbac-proxy-self/0.log" Apr 16 16:55:39.667767 ip-10-0-130-165 kubenswrapper[2577]: I0416 
16:55:39.667739 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5x8g6_d8bf96d8-da41-4d71-80d1-f04a83e90145/node-exporter/0.log" Apr 16 16:55:39.689874 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:39.689837 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5x8g6_d8bf96d8-da41-4d71-80d1-f04a83e90145/kube-rbac-proxy/0.log" Apr 16 16:55:39.712190 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:39.712165 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5x8g6_d8bf96d8-da41-4d71-80d1-f04a83e90145/init-textfile/0.log" Apr 16 16:55:39.893851 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:39.893749 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-4ch4v_71243495-3c08-450a-b8d9-dce03ef8be95/kube-rbac-proxy-main/0.log" Apr 16 16:55:39.919071 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:39.919041 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-4ch4v_71243495-3c08-450a-b8d9-dce03ef8be95/kube-rbac-proxy-self/0.log" Apr 16 16:55:39.944337 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:39.944307 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-4ch4v_71243495-3c08-450a-b8d9-dce03ef8be95/openshift-state-metrics/0.log" Apr 16 16:55:39.994628 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:39.994588 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_462c2651-1d30-4908-9371-dc7b66a64e53/prometheus/0.log" Apr 16 16:55:40.028134 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:40.028109 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_462c2651-1d30-4908-9371-dc7b66a64e53/config-reloader/0.log" Apr 16 16:55:40.055993 
ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:40.055953 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_462c2651-1d30-4908-9371-dc7b66a64e53/thanos-sidecar/0.log" Apr 16 16:55:40.081327 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:40.081298 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_462c2651-1d30-4908-9371-dc7b66a64e53/kube-rbac-proxy-web/0.log" Apr 16 16:55:40.105848 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:40.105826 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_462c2651-1d30-4908-9371-dc7b66a64e53/kube-rbac-proxy/0.log" Apr 16 16:55:40.131844 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:40.131820 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_462c2651-1d30-4908-9371-dc7b66a64e53/kube-rbac-proxy-thanos/0.log" Apr 16 16:55:40.182927 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:40.182900 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_462c2651-1d30-4908-9371-dc7b66a64e53/init-config-reloader/0.log" Apr 16 16:55:40.215625 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:40.215600 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-849wn_b09cd66f-8167-4706-bf22-d8813a45efde/prometheus-operator/0.log" Apr 16 16:55:40.247174 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:40.247147 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-849wn_b09cd66f-8167-4706-bf22-d8813a45efde/kube-rbac-proxy/0.log" Apr 16 16:55:40.281808 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:40.281781 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-c4dfj_90402a41-1582-462e-a0fc-4ffd6b779e4b/prometheus-operator-admission-webhook/0.log" Apr 16 16:55:42.645259 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:42.645226 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pfzkj/perf-node-gather-daemonset-tcqbx"] Apr 16 16:55:42.645655 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:42.645585 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4a3d3efe-c973-4ce9-9cd0-fdcca3587975" containerName="copy" Apr 16 16:55:42.645655 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:42.645597 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a3d3efe-c973-4ce9-9cd0-fdcca3587975" containerName="copy" Apr 16 16:55:42.645655 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:42.645612 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4a3d3efe-c973-4ce9-9cd0-fdcca3587975" containerName="gather" Apr 16 16:55:42.645655 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:42.645618 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a3d3efe-c973-4ce9-9cd0-fdcca3587975" containerName="gather" Apr 16 16:55:42.645793 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:42.645668 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="4a3d3efe-c973-4ce9-9cd0-fdcca3587975" containerName="gather" Apr 16 16:55:42.645793 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:42.645676 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="4a3d3efe-c973-4ce9-9cd0-fdcca3587975" containerName="copy" Apr 16 16:55:42.652293 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:42.652268 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-tcqbx" Apr 16 16:55:42.654971 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:42.654952 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-pfzkj\"/\"openshift-service-ca.crt\"" Apr 16 16:55:42.655578 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:42.655559 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-pfzkj\"/\"default-dockercfg-q75nm\"" Apr 16 16:55:42.656225 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:42.656209 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-pfzkj\"/\"kube-root-ca.crt\"" Apr 16 16:55:42.661320 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:42.661294 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pfzkj/perf-node-gather-daemonset-tcqbx"] Apr 16 16:55:42.732551 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:42.732516 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b185aee5-4480-4cd2-b86a-9e4711492546-proc\") pod \"perf-node-gather-daemonset-tcqbx\" (UID: \"b185aee5-4480-4cd2-b86a-9e4711492546\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-tcqbx" Apr 16 16:55:42.732744 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:42.732579 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b185aee5-4480-4cd2-b86a-9e4711492546-lib-modules\") pod \"perf-node-gather-daemonset-tcqbx\" (UID: \"b185aee5-4480-4cd2-b86a-9e4711492546\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-tcqbx" Apr 16 16:55:42.732744 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:42.732729 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b185aee5-4480-4cd2-b86a-9e4711492546-sys\") pod \"perf-node-gather-daemonset-tcqbx\" (UID: \"b185aee5-4480-4cd2-b86a-9e4711492546\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-tcqbx" Apr 16 16:55:42.732837 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:42.732781 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8vg4\" (UniqueName: \"kubernetes.io/projected/b185aee5-4480-4cd2-b86a-9e4711492546-kube-api-access-t8vg4\") pod \"perf-node-gather-daemonset-tcqbx\" (UID: \"b185aee5-4480-4cd2-b86a-9e4711492546\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-tcqbx" Apr 16 16:55:42.732837 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:42.732823 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b185aee5-4480-4cd2-b86a-9e4711492546-podres\") pod \"perf-node-gather-daemonset-tcqbx\" (UID: \"b185aee5-4480-4cd2-b86a-9e4711492546\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-tcqbx" Apr 16 16:55:42.834266 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:42.834232 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b185aee5-4480-4cd2-b86a-9e4711492546-sys\") pod \"perf-node-gather-daemonset-tcqbx\" (UID: \"b185aee5-4480-4cd2-b86a-9e4711492546\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-tcqbx" Apr 16 16:55:42.834494 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:42.834284 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t8vg4\" (UniqueName: \"kubernetes.io/projected/b185aee5-4480-4cd2-b86a-9e4711492546-kube-api-access-t8vg4\") pod \"perf-node-gather-daemonset-tcqbx\" (UID: 
\"b185aee5-4480-4cd2-b86a-9e4711492546\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-tcqbx"
Apr 16 16:55:42.834494 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:42.834316 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b185aee5-4480-4cd2-b86a-9e4711492546-podres\") pod \"perf-node-gather-daemonset-tcqbx\" (UID: \"b185aee5-4480-4cd2-b86a-9e4711492546\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-tcqbx"
Apr 16 16:55:42.834494 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:42.834361 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b185aee5-4480-4cd2-b86a-9e4711492546-sys\") pod \"perf-node-gather-daemonset-tcqbx\" (UID: \"b185aee5-4480-4cd2-b86a-9e4711492546\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-tcqbx"
Apr 16 16:55:42.834494 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:42.834366 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b185aee5-4480-4cd2-b86a-9e4711492546-proc\") pod \"perf-node-gather-daemonset-tcqbx\" (UID: \"b185aee5-4480-4cd2-b86a-9e4711492546\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-tcqbx"
Apr 16 16:55:42.834494 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:42.834417 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b185aee5-4480-4cd2-b86a-9e4711492546-proc\") pod \"perf-node-gather-daemonset-tcqbx\" (UID: \"b185aee5-4480-4cd2-b86a-9e4711492546\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-tcqbx"
Apr 16 16:55:42.834494 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:42.834478 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b185aee5-4480-4cd2-b86a-9e4711492546-lib-modules\") pod \"perf-node-gather-daemonset-tcqbx\" (UID: \"b185aee5-4480-4cd2-b86a-9e4711492546\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-tcqbx"
Apr 16 16:55:42.834736 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:42.834527 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b185aee5-4480-4cd2-b86a-9e4711492546-podres\") pod \"perf-node-gather-daemonset-tcqbx\" (UID: \"b185aee5-4480-4cd2-b86a-9e4711492546\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-tcqbx"
Apr 16 16:55:42.834736 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:42.834604 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b185aee5-4480-4cd2-b86a-9e4711492546-lib-modules\") pod \"perf-node-gather-daemonset-tcqbx\" (UID: \"b185aee5-4480-4cd2-b86a-9e4711492546\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-tcqbx"
Apr 16 16:55:42.843842 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:42.843819 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8vg4\" (UniqueName: \"kubernetes.io/projected/b185aee5-4480-4cd2-b86a-9e4711492546-kube-api-access-t8vg4\") pod \"perf-node-gather-daemonset-tcqbx\" (UID: \"b185aee5-4480-4cd2-b86a-9e4711492546\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-tcqbx"
Apr 16 16:55:42.962910 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:42.962872 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-tcqbx"
Apr 16 16:55:43.100212 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:43.100183 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pfzkj/perf-node-gather-daemonset-tcqbx"]
Apr 16 16:55:43.102163 ip-10-0-130-165 kubenswrapper[2577]: W0416 16:55:43.102130 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb185aee5_4480_4cd2_b86a_9e4711492546.slice/crio-694d673de9ec39f66f99628c5788913d3c81c5ca54c9e589debe526d7de3d572 WatchSource:0}: Error finding container 694d673de9ec39f66f99628c5788913d3c81c5ca54c9e589debe526d7de3d572: Status 404 returned error can't find the container with id 694d673de9ec39f66f99628c5788913d3c81c5ca54c9e589debe526d7de3d572
Apr 16 16:55:43.705754 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:43.705712 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-tcqbx" event={"ID":"b185aee5-4480-4cd2-b86a-9e4711492546","Type":"ContainerStarted","Data":"73a7e8274293b71d5c99a3b485eef113537cdad947b9db5137fbbf6e8ca2bd78"}
Apr 16 16:55:43.705754 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:43.705749 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-tcqbx" event={"ID":"b185aee5-4480-4cd2-b86a-9e4711492546","Type":"ContainerStarted","Data":"694d673de9ec39f66f99628c5788913d3c81c5ca54c9e589debe526d7de3d572"}
Apr 16 16:55:43.706155 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:43.705847 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-tcqbx"
Apr 16 16:55:43.731319 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:43.731271 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-tcqbx" podStartSLOduration=1.731255644 podStartE2EDuration="1.731255644s" podCreationTimestamp="2026-04-16 16:55:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:55:43.729576987 +0000 UTC m=+1972.694357484" watchObservedRunningTime="2026-04-16 16:55:43.731255644 +0000 UTC m=+1972.696036196"
Apr 16 16:55:44.453608 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:44.453583 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-pfv5k_8016a568-6fe9-4dfc-a543-f50b2768e5b2/dns/0.log"
Apr 16 16:55:44.516807 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:44.516781 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-pfv5k_8016a568-6fe9-4dfc-a543-f50b2768e5b2/kube-rbac-proxy/0.log"
Apr 16 16:55:44.545153 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:44.545124 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7h5k5_937105e9-6cc7-458f-9b5c-007250aa5a6c/dns-node-resolver/0.log"
Apr 16 16:55:45.095641 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:45.095609 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7c6b454dd-p6wm8_e87aa010-2a1b-4e10-a6a1-5a99c9830e6f/registry/0.log"
Apr 16 16:55:45.166707 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:45.166680 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-gq7bg_b5a35ec4-25f4-4c5b-8175-23e377d3e9b3/node-ca/0.log"
Apr 16 16:55:46.104352 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:46.104323 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-fd2h8_981a5a65-c6e6-43dd-828e-1d3b5a580b24/istio-proxy/0.log"
Apr 16 16:55:46.734476 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:46.734434 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-z5t69_461b689e-a41b-4182-ba52-e26a1dfbc007/serve-healthcheck-canary/0.log"
Apr 16 16:55:47.633509 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:47.633465 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wfnkm_d8cd52ef-667c-4000-b683-c3c39c1df67e/kube-rbac-proxy/0.log"
Apr 16 16:55:47.669139 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:47.669111 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wfnkm_d8cd52ef-667c-4000-b683-c3c39c1df67e/exporter/0.log"
Apr 16 16:55:47.712520 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:47.712478 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wfnkm_d8cd52ef-667c-4000-b683-c3c39c1df67e/extractor/0.log"
Apr 16 16:55:49.718257 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:49.718232 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-tcqbx"
Apr 16 16:55:50.468192 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:50.468164 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-846585b969-kxngw_38a2106e-c979-4ffa-8381-3b151f24acd7/manager/0.log"
Apr 16 16:55:57.223522 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:57.223486 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-txg78_228b5774-6748-4592-bb81-0b7f69e4dcc8/kube-storage-version-migrator-operator/1.log"
Apr 16 16:55:57.224392 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:57.224375 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-txg78_228b5774-6748-4592-bb81-0b7f69e4dcc8/kube-storage-version-migrator-operator/0.log"
Apr 16 16:55:58.425835 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:58.425810 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b587s_71da194f-358e-449e-9a55-2882465c41ef/kube-multus-additional-cni-plugins/0.log"
Apr 16 16:55:58.457608 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:58.457581 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b587s_71da194f-358e-449e-9a55-2882465c41ef/egress-router-binary-copy/0.log"
Apr 16 16:55:58.484773 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:58.484741 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b587s_71da194f-358e-449e-9a55-2882465c41ef/cni-plugins/0.log"
Apr 16 16:55:58.511908 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:58.511878 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b587s_71da194f-358e-449e-9a55-2882465c41ef/bond-cni-plugin/0.log"
Apr 16 16:55:58.542987 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:58.542959 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b587s_71da194f-358e-449e-9a55-2882465c41ef/routeoverride-cni/0.log"
Apr 16 16:55:58.568244 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:58.568218 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b587s_71da194f-358e-449e-9a55-2882465c41ef/whereabouts-cni-bincopy/0.log"
Apr 16 16:55:58.601291 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:58.601261 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b587s_71da194f-358e-449e-9a55-2882465c41ef/whereabouts-cni/0.log"
Apr 16 16:55:59.113255 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:59.113220 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-p6shp_3c6f4643-0f15-43f3-b51e-e048015bf431/kube-multus/0.log"
Apr 16 16:55:59.348692 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:59.348664 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-sdrp4_858151a3-bcef-4b9a-94c3-32bd1f0db177/network-metrics-daemon/0.log"
Apr 16 16:55:59.416350 ip-10-0-130-165 kubenswrapper[2577]: I0416 16:55:59.416323 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-sdrp4_858151a3-bcef-4b9a-94c3-32bd1f0db177/kube-rbac-proxy/0.log"