Apr 21 15:32:47.695938 ip-10-0-128-232 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 21 15:32:47.695952 ip-10-0-128-232 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 21 15:32:47.695963 ip-10-0-128-232 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 21 15:32:47.696183 ip-10-0-128-232 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 21 15:32:57.885850 ip-10-0-128-232 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 21 15:32:57.885867 ip-10-0-128-232 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot b648909d8fc34475a26bdd6dc42c8fa8 --
Apr 21 15:35:24.306792 ip-10-0-128-232 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 15:35:24.783988 ip-10-0-128-232 kubenswrapper[2569]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 15:35:24.783988 ip-10-0-128-232 kubenswrapper[2569]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 15:35:24.783988 ip-10-0-128-232 kubenswrapper[2569]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 15:35:24.783988 ip-10-0-128-232 kubenswrapper[2569]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
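The failures in the first boot above ("Failed to load environment files", a failed 'start-pre' task, and "Unit crio.service not found") are the pattern systemd produces when a unit references files and dependencies that do not yet exist on disk. A minimal sketch of the kind of unit directives involved — the paths, the env-file name, and the dependency lines here are illustrative assumptions, not the node's actual kubelet.service:

```ini
# Illustrative sketch of a kubelet.service; NOT the real unit from this node.
[Unit]
Description=Kubernetes Kubelet
# A hard dependency like this can yield "Failed to schedule restart job:
# Unit crio.service not found" while CRI-O is not yet installed/loaded.
Requires=crio.service
After=crio.service

[Service]
# If this file is absent (and not prefixed with "-"), systemd logs
# "Failed to load environment files: No such file or directory".
EnvironmentFile=/etc/kubernetes/kubelet-env
# A missing ExecStartPre binary fails the 'start-pre' task the same way.
ExecStartPre=/usr/local/bin/kubelet-prestart.sh
ExecStart=/usr/bin/kubelet $KUBELET_FLAGS
Restart=always
RestartSec=10
```

Once the missing files and the crio.service unit appear (as they evidently did by the second boot), the same unit starts normally.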
Apr 21 15:35:24.783988 ip-10-0-128-232 kubenswrapper[2569]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 15:35:24.785765 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.785668 2569 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 15:35:24.793779 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793752 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 15:35:24.793779 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793773 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 15:35:24.793779 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793778 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 15:35:24.793779 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793781 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 15:35:24.793779 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793784 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 15:35:24.793779 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793788 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 15:35:24.794003 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793791 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 15:35:24.794003 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793798 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 15:35:24.794003 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793801 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 15:35:24.794003 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793805 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 15:35:24.794003 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793809 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 15:35:24.794003 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793812 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 15:35:24.794003 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793817 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 15:35:24.794003 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793821 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 15:35:24.794003 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793825 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 15:35:24.794003 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793828 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 15:35:24.794003 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793830 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 15:35:24.794003 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793834 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 15:35:24.794003 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793837 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 15:35:24.794003 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793843 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 15:35:24.794003 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793845 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 15:35:24.794003 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793848 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 15:35:24.794003 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793851 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 15:35:24.794003 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793855 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 15:35:24.794450 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793859 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 15:35:24.794450 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793862 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 15:35:24.794450 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793865 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 15:35:24.794450 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793868 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 15:35:24.794450 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793870 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 15:35:24.794450 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793873 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 15:35:24.794450 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793876 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 15:35:24.794450 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793878 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 15:35:24.794450 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793885 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 15:35:24.794450 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793887 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 15:35:24.794450 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793890 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 15:35:24.794450 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793893 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 15:35:24.794450 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793896 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 15:35:24.794450 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793899 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 15:35:24.794450 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793902 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 15:35:24.794450 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793905 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 15:35:24.794450 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793909 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 15:35:24.794450 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793912 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 15:35:24.794450 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793915 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 15:35:24.794450 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793920 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 15:35:24.794984 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793923 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 15:35:24.794984 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793925 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 15:35:24.794984 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793928 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 15:35:24.794984 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793931 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 15:35:24.794984 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793933 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 15:35:24.794984 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793936 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 15:35:24.794984 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793939 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 15:35:24.794984 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793942 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 15:35:24.794984 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793944 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 15:35:24.794984 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793947 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 15:35:24.794984 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793949 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 15:35:24.794984 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793952 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 15:35:24.794984 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793957 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 15:35:24.794984 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793960 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 15:35:24.794984 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793962 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 15:35:24.794984 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793965 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 15:35:24.794984 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793968 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 15:35:24.794984 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793970 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 15:35:24.794984 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793973 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 15:35:24.794984 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793975 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 15:35:24.795469 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793978 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 15:35:24.795469 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793981 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 15:35:24.795469 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793985 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 15:35:24.795469 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793988 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 15:35:24.795469 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793993 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 15:35:24.795469 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793995 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 15:35:24.795469 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.793998 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 15:35:24.795469 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794002 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 15:35:24.795469 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794005 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 15:35:24.795469 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794009 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 15:35:24.795469 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794012 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 15:35:24.795469 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794015 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 15:35:24.795469 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794017 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 15:35:24.795469 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794020 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 15:35:24.795469 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794023 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 15:35:24.795469 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794026 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 15:35:24.795469 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794028 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 15:35:24.795469 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794033 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 15:35:24.795469 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794036 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 15:35:24.795469 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794039 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 15:35:24.795961 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794045 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 15:35:24.795961 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794048 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 15:35:24.795961 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794711 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 15:35:24.795961 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794718 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 15:35:24.795961 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794721 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 15:35:24.795961 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794724 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 15:35:24.795961 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794727 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 15:35:24.795961 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794733 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 15:35:24.795961 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794736 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 15:35:24.795961 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794739 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 15:35:24.795961 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794742 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 15:35:24.795961 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794745 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 15:35:24.795961 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794748 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 15:35:24.795961 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794751 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 15:35:24.795961 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794754 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 15:35:24.795961 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794757 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 15:35:24.795961 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794760 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 15:35:24.795961 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794762 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 15:35:24.795961 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794766 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 15:35:24.795961 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794772 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 15:35:24.795961 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794775 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 15:35:24.796459 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794778 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 15:35:24.796459 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794781 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 15:35:24.796459 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794784 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 15:35:24.796459 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794787 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 15:35:24.796459 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794789 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 15:35:24.796459 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794792 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 15:35:24.796459 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794795 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 15:35:24.796459 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794798 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 15:35:24.796459 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794800 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 15:35:24.796459 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794803 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 15:35:24.796459 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794806 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 15:35:24.796459 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794811 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 15:35:24.796459 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794814 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 15:35:24.796459 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794817 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 15:35:24.796459 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794820 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 15:35:24.796459 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794822 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 15:35:24.796459 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794826 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 15:35:24.796459 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794829 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 15:35:24.796459 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794831 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 15:35:24.796459 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794834 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 15:35:24.796960 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794836 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 15:35:24.796960 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794840 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 15:35:24.796960 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794842 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 15:35:24.796960 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794847 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 15:35:24.796960 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794850 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 15:35:24.796960 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794853 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 15:35:24.796960 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794856 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 15:35:24.796960 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794858 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 15:35:24.796960 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794861 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 15:35:24.796960 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794864 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 15:35:24.796960 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794867 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 15:35:24.796960 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794869 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 15:35:24.796960 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794872 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 15:35:24.796960 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794876 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 15:35:24.796960 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794879 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 15:35:24.796960 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794882 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 15:35:24.796960 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794887 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 15:35:24.796960 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794890 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 15:35:24.796960 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794893 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 15:35:24.796960 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794896 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 15:35:24.797474 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794899 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 15:35:24.797474 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794902 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 15:35:24.797474 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794905 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 15:35:24.797474 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794907 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 15:35:24.797474 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794910 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 15:35:24.797474 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794914 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 15:35:24.797474 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794917 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 15:35:24.797474 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794919 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 15:35:24.797474 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794924 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 15:35:24.797474 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794927 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 15:35:24.797474 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794929 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 15:35:24.797474 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794932 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 15:35:24.797474 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794935 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 15:35:24.797474 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794937 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 15:35:24.797474 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794940 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 15:35:24.797474 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794943 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 15:35:24.797474 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794946 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 15:35:24.797474 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794949 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 15:35:24.797474 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794951 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 15:35:24.797957 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794954 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 15:35:24.797957 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794957 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 15:35:24.797957 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794962 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 15:35:24.797957 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794965 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 15:35:24.797957 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794969 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 15:35:24.797957 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794973 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 15:35:24.797957 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794977 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 15:35:24.797957 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.794981 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 15:35:24.797957 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796690 2569 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 15:35:24.797957 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796717 2569 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 15:35:24.797957 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796726 2569 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 15:35:24.797957 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796732 2569 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 15:35:24.797957 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796737 2569 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 15:35:24.797957 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796741 2569 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 15:35:24.797957 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796746 2569 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 15:35:24.797957 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796754 2569 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 15:35:24.797957 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796757 2569 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 15:35:24.797957 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796761 2569 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 15:35:24.797957 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796764 2569 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 15:35:24.797957 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796768 2569 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 15:35:24.797957 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796771 2569 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 15:35:24.798578 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796774 2569 flags.go:64] FLAG: --cgroup-root=""
Apr 21 15:35:24.798578 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796777 2569 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 15:35:24.798578 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796780 2569 flags.go:64] FLAG: --client-ca-file=""
Apr 21 15:35:24.798578 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796783 2569 flags.go:64] FLAG: --cloud-config=""
Apr 21 15:35:24.798578 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796786 2569 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 15:35:24.798578 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796790 2569 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 15:35:24.798578 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796795 2569 flags.go:64] FLAG: --cluster-domain=""
Apr 21 15:35:24.798578 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796798 2569 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 15:35:24.798578 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796801 2569 flags.go:64] FLAG: --config-dir=""
Apr 21 15:35:24.798578 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796804 2569 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 15:35:24.798578 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796808 2569 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 15:35:24.798578 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796816 2569 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 15:35:24.798578 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796820 2569 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 15:35:24.798578 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796823 2569 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 15:35:24.798578 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796826 2569 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 15:35:24.798578 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796830 2569 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 15:35:24.798578 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796834 2569 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 15:35:24.798578 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796837 2569 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 15:35:24.798578 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796840 2569 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 15:35:24.798578 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796844 2569 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 15:35:24.798578 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796849 2569 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 15:35:24.798578 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796853 2569 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 15:35:24.798578 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796856 2569 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 15:35:24.798578 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796859 2569 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 15:35:24.798578 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796862 2569 flags.go:64] FLAG: --enable-server="true"
Apr 21 15:35:24.799182 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796865 2569 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 15:35:24.799182 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796870 2569 flags.go:64] FLAG: --event-burst="100"
Apr 21 15:35:24.799182 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796874 2569 flags.go:64] FLAG: --event-qps="50"
Apr 21 15:35:24.799182 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796877 2569 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 15:35:24.799182 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796880 2569 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 15:35:24.799182 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796883 2569 flags.go:64] FLAG: --eviction-hard=""
Apr 21 15:35:24.799182 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796887 2569 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 15:35:24.799182 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796890 2569 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 15:35:24.799182 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796894 2569 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 15:35:24.799182 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796897 2569 flags.go:64] FLAG: --eviction-soft=""
Apr 21 15:35:24.799182 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796899 2569 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 15:35:24.799182 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796902 2569 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 15:35:24.799182 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796905 2569 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 15:35:24.799182 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796908 2569 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 15:35:24.799182 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796912 2569 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21 15:35:24.799182 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796915 2569 flags.go:64] FLAG: --fail-swap-on="true"
Apr 21 15:35:24.799182 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796918 2569 flags.go:64] FLAG: --feature-gates=""
Apr 21 15:35:24.799182 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796922 2569 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 21 15:35:24.799182 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796925 2569
flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 15:35:24.799182 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796929 2569 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 15:35:24.799182 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796932 2569 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 15:35:24.799182 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796935 2569 flags.go:64] FLAG: --healthz-port="10248" Apr 21 15:35:24.799182 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796938 2569 flags.go:64] FLAG: --help="false" Apr 21 15:35:24.799182 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796941 2569 flags.go:64] FLAG: --hostname-override="ip-10-0-128-232.ec2.internal" Apr 21 15:35:24.799182 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796944 2569 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 15:35:24.799798 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796947 2569 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 15:35:24.799798 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796950 2569 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 15:35:24.799798 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796955 2569 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 15:35:24.799798 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796960 2569 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 15:35:24.799798 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796963 2569 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 15:35:24.799798 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796966 2569 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 15:35:24.799798 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796969 2569 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 15:35:24.799798 
ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796972 2569 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 15:35:24.799798 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796975 2569 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 15:35:24.799798 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796978 2569 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 15:35:24.799798 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796981 2569 flags.go:64] FLAG: --kube-reserved="" Apr 21 15:35:24.799798 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796984 2569 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 15:35:24.799798 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796987 2569 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 15:35:24.799798 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796990 2569 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 15:35:24.799798 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796992 2569 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 15:35:24.799798 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796995 2569 flags.go:64] FLAG: --lock-file="" Apr 21 15:35:24.799798 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.796998 2569 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 15:35:24.799798 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797001 2569 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 15:35:24.799798 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797004 2569 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 15:35:24.799798 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797009 2569 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 15:35:24.799798 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797012 2569 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 15:35:24.799798 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797016 2569 flags.go:64] FLAG: 
--log-text-split-stream="false" Apr 21 15:35:24.799798 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797019 2569 flags.go:64] FLAG: --logging-format="text" Apr 21 15:35:24.800363 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797022 2569 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 15:35:24.800363 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797025 2569 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 15:35:24.800363 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797028 2569 flags.go:64] FLAG: --manifest-url="" Apr 21 15:35:24.800363 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797031 2569 flags.go:64] FLAG: --manifest-url-header="" Apr 21 15:35:24.800363 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797036 2569 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 15:35:24.800363 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797039 2569 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 15:35:24.800363 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797043 2569 flags.go:64] FLAG: --max-pods="110" Apr 21 15:35:24.800363 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797046 2569 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 15:35:24.800363 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797049 2569 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 15:35:24.800363 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797052 2569 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 15:35:24.800363 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797055 2569 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 15:35:24.800363 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797059 2569 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 15:35:24.800363 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797062 2569 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 15:35:24.800363 ip-10-0-128-232 
kubenswrapper[2569]: I0421 15:35:24.797065 2569 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 15:35:24.800363 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797073 2569 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 15:35:24.800363 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797076 2569 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 15:35:24.800363 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797080 2569 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 15:35:24.800363 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797085 2569 flags.go:64] FLAG: --pod-cidr="" Apr 21 15:35:24.800363 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797088 2569 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 15:35:24.800363 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797093 2569 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 15:35:24.800363 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797096 2569 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 15:35:24.800363 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797100 2569 flags.go:64] FLAG: --pods-per-core="0" Apr 21 15:35:24.800363 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797102 2569 flags.go:64] FLAG: --port="10250" Apr 21 15:35:24.800363 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797106 2569 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 15:35:24.800961 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797108 2569 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0894434791b4d8831" Apr 21 15:35:24.800961 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797112 2569 flags.go:64] FLAG: --qos-reserved="" Apr 21 15:35:24.800961 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797115 2569 flags.go:64] FLAG: --read-only-port="10255" Apr 21 
15:35:24.800961 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797118 2569 flags.go:64] FLAG: --register-node="true" Apr 21 15:35:24.800961 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797121 2569 flags.go:64] FLAG: --register-schedulable="true" Apr 21 15:35:24.800961 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797123 2569 flags.go:64] FLAG: --register-with-taints="" Apr 21 15:35:24.800961 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797127 2569 flags.go:64] FLAG: --registry-burst="10" Apr 21 15:35:24.800961 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797130 2569 flags.go:64] FLAG: --registry-qps="5" Apr 21 15:35:24.800961 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797133 2569 flags.go:64] FLAG: --reserved-cpus="" Apr 21 15:35:24.800961 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797136 2569 flags.go:64] FLAG: --reserved-memory="" Apr 21 15:35:24.800961 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797140 2569 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 15:35:24.800961 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797143 2569 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 15:35:24.800961 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797146 2569 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 15:35:24.800961 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797149 2569 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 15:35:24.800961 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797152 2569 flags.go:64] FLAG: --runonce="false" Apr 21 15:35:24.800961 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797155 2569 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 15:35:24.800961 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797158 2569 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 15:35:24.800961 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797161 2569 flags.go:64] FLAG: --seccomp-default="false" 
Apr 21 15:35:24.800961 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797165 2569 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 15:35:24.800961 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797168 2569 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 15:35:24.800961 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797171 2569 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 15:35:24.800961 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797174 2569 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 15:35:24.800961 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797177 2569 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 15:35:24.800961 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797180 2569 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 15:35:24.800961 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797183 2569 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 15:35:24.800961 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797186 2569 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 15:35:24.801609 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797190 2569 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 15:35:24.801609 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797193 2569 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 15:35:24.801609 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797196 2569 flags.go:64] FLAG: --system-cgroups="" Apr 21 15:35:24.801609 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797199 2569 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 15:35:24.801609 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797206 2569 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 15:35:24.801609 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797209 2569 flags.go:64] FLAG: --tls-cert-file="" Apr 21 15:35:24.801609 ip-10-0-128-232 
kubenswrapper[2569]: I0421 15:35:24.797211 2569 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 15:35:24.801609 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797217 2569 flags.go:64] FLAG: --tls-min-version="" Apr 21 15:35:24.801609 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797219 2569 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 15:35:24.801609 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797222 2569 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 15:35:24.801609 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797225 2569 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 15:35:24.801609 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797228 2569 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 15:35:24.801609 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797232 2569 flags.go:64] FLAG: --v="2" Apr 21 15:35:24.801609 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797236 2569 flags.go:64] FLAG: --version="false" Apr 21 15:35:24.801609 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797240 2569 flags.go:64] FLAG: --vmodule="" Apr 21 15:35:24.801609 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797245 2569 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 15:35:24.801609 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.797248 2569 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 15:35:24.801609 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797353 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 15:35:24.801609 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797357 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 15:35:24.801609 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797360 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 15:35:24.801609 ip-10-0-128-232 
kubenswrapper[2569]: W0421 15:35:24.797363 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 15:35:24.801609 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797367 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 15:35:24.801609 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797370 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 15:35:24.802165 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797373 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 15:35:24.802165 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797376 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 15:35:24.802165 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797378 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 15:35:24.802165 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797381 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 15:35:24.802165 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797384 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 15:35:24.802165 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797387 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 15:35:24.802165 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797389 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 15:35:24.802165 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797392 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 15:35:24.802165 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797394 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 15:35:24.802165 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797397 2569 feature_gate.go:328] unrecognized feature gate: 
Example2 Apr 21 15:35:24.802165 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797400 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 15:35:24.802165 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797403 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 15:35:24.802165 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797405 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 15:35:24.802165 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797408 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 15:35:24.802165 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797410 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 15:35:24.802165 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797413 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 15:35:24.802165 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797415 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 15:35:24.802165 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797418 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 15:35:24.802165 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797420 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 15:35:24.802165 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797423 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 15:35:24.802716 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797425 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 15:35:24.802716 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797429 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 15:35:24.802716 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797433 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 15:35:24.802716 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797435 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 15:35:24.802716 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797438 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 15:35:24.802716 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797441 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 15:35:24.802716 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797446 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 15:35:24.802716 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797449 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 15:35:24.802716 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797452 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 15:35:24.802716 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797455 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 15:35:24.802716 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797458 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 15:35:24.802716 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797460 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 15:35:24.802716 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797463 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 15:35:24.802716 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797466 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 15:35:24.802716 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797473 2569 
feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 15:35:24.802716 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797493 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 15:35:24.802716 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797498 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 15:35:24.802716 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797503 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 15:35:24.802716 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797508 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 15:35:24.802716 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797511 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 15:35:24.803227 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797514 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 15:35:24.803227 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797517 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 15:35:24.803227 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797520 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 15:35:24.803227 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797523 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 15:35:24.803227 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797526 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 15:35:24.803227 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797529 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 15:35:24.803227 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797532 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 15:35:24.803227 ip-10-0-128-232 kubenswrapper[2569]: W0421 
15:35:24.797535 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 15:35:24.803227 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797537 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 15:35:24.803227 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797540 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 15:35:24.803227 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797543 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 15:35:24.803227 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797546 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 15:35:24.803227 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797548 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 15:35:24.803227 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797551 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 15:35:24.803227 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797554 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 15:35:24.803227 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797556 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 15:35:24.803227 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797559 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 15:35:24.803227 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797562 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 15:35:24.803227 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797567 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 15:35:24.803816 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797571 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Apr 21 15:35:24.803816 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797574 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 15:35:24.803816 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797576 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 15:35:24.803816 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797579 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 15:35:24.803816 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797582 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 15:35:24.803816 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797584 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 15:35:24.803816 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797587 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 15:35:24.803816 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797590 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 15:35:24.803816 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797593 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 15:35:24.803816 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797596 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 15:35:24.803816 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797598 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 15:35:24.803816 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797601 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 15:35:24.803816 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797604 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 15:35:24.803816 ip-10-0-128-232 
kubenswrapper[2569]: W0421 15:35:24.797606 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 15:35:24.803816 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797609 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 15:35:24.803816 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797611 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 15:35:24.803816 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797614 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 15:35:24.803816 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797617 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 15:35:24.803816 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797620 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 15:35:24.803816 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797623 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 15:35:24.804322 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.797626 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 15:35:24.804322 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.798253 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 15:35:24.805056 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.805036 2569 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 21 15:35:24.805094 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.805057 2569 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 21 15:35:24.805129 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805108 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 15:35:24.805129 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805114 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 15:35:24.805129 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805117 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 15:35:24.805129 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805120 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 15:35:24.805129 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805123 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 15:35:24.805129 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805127 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 15:35:24.805129 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805130 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 15:35:24.805310 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805134 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 15:35:24.805310 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805137 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 15:35:24.805310 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805140 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 15:35:24.805310 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805142 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 15:35:24.805310 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805145 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 15:35:24.805310 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805148 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 15:35:24.805310 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805150 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 15:35:24.805310 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805153 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 15:35:24.805310 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805156 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 15:35:24.805310 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805160 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 15:35:24.805310 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805164 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 15:35:24.805310 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805167 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 15:35:24.805310 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805170 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 15:35:24.805310 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805173 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 15:35:24.805310 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805176 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 15:35:24.805310 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805179 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 15:35:24.805310 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805181 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 15:35:24.805310 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805184 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 15:35:24.805310 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805187 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 15:35:24.805793 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805190 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 15:35:24.805793 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805193 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 15:35:24.805793 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805195 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 15:35:24.805793 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805198 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 15:35:24.805793 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805201 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 15:35:24.805793 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805205 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 15:35:24.805793 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805208 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 15:35:24.805793 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805210 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 15:35:24.805793 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805214 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 15:35:24.805793 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805217 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 15:35:24.805793 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805219 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 15:35:24.805793 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805222 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 15:35:24.805793 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805225 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 15:35:24.805793 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805228 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 15:35:24.805793 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805231 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 15:35:24.805793 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805235 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 15:35:24.805793 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805238 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 15:35:24.805793 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805241 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 15:35:24.805793 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805243 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 15:35:24.805793 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805246 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 15:35:24.806288 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805248 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 15:35:24.806288 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805251 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 15:35:24.806288 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805254 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 15:35:24.806288 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805256 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 15:35:24.806288 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805259 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 15:35:24.806288 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805262 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 15:35:24.806288 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805265 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 15:35:24.806288 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805267 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 15:35:24.806288 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805270 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 15:35:24.806288 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805272 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 15:35:24.806288 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805275 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 15:35:24.806288 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805278 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 15:35:24.806288 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805280 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 15:35:24.806288 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805283 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 15:35:24.806288 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805286 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 15:35:24.806288 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805288 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 15:35:24.806288 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805291 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 15:35:24.806288 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805293 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 15:35:24.806288 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805297 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 15:35:24.806288 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805299 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 15:35:24.806805 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805302 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 15:35:24.806805 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805305 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 15:35:24.806805 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805308 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 15:35:24.806805 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805310 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 15:35:24.806805 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805313 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 15:35:24.806805 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805317 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 15:35:24.806805 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805320 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 15:35:24.806805 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805323 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 15:35:24.806805 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805326 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 15:35:24.806805 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805328 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 15:35:24.806805 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805331 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 15:35:24.806805 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805334 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 15:35:24.806805 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805337 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 15:35:24.806805 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805339 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 15:35:24.806805 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805342 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 15:35:24.806805 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805346 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 15:35:24.806805 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805350 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 15:35:24.806805 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805352 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 15:35:24.806805 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805355 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 15:35:24.807269 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805357 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 15:35:24.807269 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.805363 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 15:35:24.807269 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805471 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 15:35:24.807269 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805491 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 15:35:24.807269 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805494 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 15:35:24.807269 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805498 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 15:35:24.807269 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805502 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 15:35:24.807269 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805505 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 15:35:24.807269 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805508 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 15:35:24.807269 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805510 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 15:35:24.807269 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805513 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 15:35:24.807269 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805516 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 15:35:24.807269 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805524 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 15:35:24.807269 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805528 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 15:35:24.807269 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805530 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 15:35:24.807768 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805533 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 15:35:24.807768 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805536 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 15:35:24.807768 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805540 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 15:35:24.807768 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805543 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 15:35:24.807768 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805547 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 15:35:24.807768 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805551 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 15:35:24.807768 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805554 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 15:35:24.807768 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805557 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 15:35:24.807768 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805559 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 15:35:24.807768 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805562 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 15:35:24.807768 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805566 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 15:35:24.807768 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805568 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 15:35:24.807768 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805572 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 15:35:24.807768 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805574 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 15:35:24.807768 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805578 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 15:35:24.807768 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805581 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 15:35:24.807768 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805583 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 15:35:24.807768 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805586 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 15:35:24.807768 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805589 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 15:35:24.808225 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805592 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 15:35:24.808225 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805594 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 15:35:24.808225 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805597 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 15:35:24.808225 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805599 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 15:35:24.808225 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805602 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 15:35:24.808225 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805604 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 15:35:24.808225 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805607 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 15:35:24.808225 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805609 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 15:35:24.808225 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805612 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 15:35:24.808225 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805614 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 15:35:24.808225 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805616 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 15:35:24.808225 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805620 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 15:35:24.808225 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805623 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 15:35:24.808225 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805625 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 15:35:24.808225 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805628 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 15:35:24.808225 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805630 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 15:35:24.808225 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805633 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 15:35:24.808225 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805636 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 15:35:24.808707 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805639 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 15:35:24.808707 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805642 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 15:35:24.808707 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805644 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 15:35:24.808707 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805647 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 15:35:24.808707 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805649 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 15:35:24.808707 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805652 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 15:35:24.808707 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805654 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 15:35:24.808707 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805657 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 15:35:24.808707 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805660 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 15:35:24.808707 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805662 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 15:35:24.808707 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805665 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 15:35:24.808707 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805668 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 15:35:24.808707 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805670 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 15:35:24.808707 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805672 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 15:35:24.808707 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805675 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 15:35:24.808707 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805678 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 15:35:24.808707 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805680 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 15:35:24.808707 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805683 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 15:35:24.808707 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805685 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 15:35:24.808707 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805688 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 15:35:24.808707 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805690 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 15:35:24.809182 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805693 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 15:35:24.809182 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805695 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 15:35:24.809182 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805698 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 15:35:24.809182 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805701 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 15:35:24.809182 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805703 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 15:35:24.809182 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805706 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 15:35:24.809182 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805709 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 15:35:24.809182 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805711 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 15:35:24.809182 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805714 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 15:35:24.809182 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805716 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 15:35:24.809182 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805719 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 15:35:24.809182 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805722 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 15:35:24.809182 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805725 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 15:35:24.809182 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805727 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 15:35:24.809182 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:24.805730 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 15:35:24.809566 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.805735 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 15:35:24.809566 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.806884 2569 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 15:35:24.810761 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.810747 2569 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 15:35:24.811743 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.811731 2569 server.go:1019] "Starting client certificate rotation"
Apr 21 15:35:24.811845 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.811831 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 15:35:24.811881 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.811864 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 15:35:24.839918 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.839892 2569 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 15:35:24.845843 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.845826 2569 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 15:35:24.866759 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.866735 2569 log.go:25] "Validated CRI v1 runtime API"
Apr 21 15:35:24.872550 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.872533 2569 log.go:25] "Validated CRI v1 image API"
Apr 21 15:35:24.873741 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.873720 2569 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 15:35:24.873856 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.873836 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 15:35:24.877264 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.877241 2569 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 7ab74a9e-3b6f-4923-87f8-5b8494c9d008:/dev/nvme0n1p3 d4ccd199-54e3-425b-bdcd-e6870d4c0e00:/dev/nvme0n1p4]
Apr 21 15:35:24.877323 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.877263 2569 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 15:35:24.882436 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.882314 2569 manager.go:217] Machine: {Timestamp:2026-04-21 15:35:24.88099429 +0000 UTC m=+0.453027225 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3102766 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2a05f2663772c19cf98bc9fae90202 SystemUUID:ec2a05f2-6637-72c1-9cf9-8bc9fae90202 BootID:b648909d-8fc3-4475-a26b-dd6dc42c8fa8 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:c6:27:88:86:07 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:c6:27:88:86:07 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:be:ea:4a:ff:a8:00 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 15:35:24.882436 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.882434 2569 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 15:35:24.882555 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.882529 2569 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 15:35:24.884460 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.884439 2569 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 15:35:24.884622 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.884463 2569 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-232.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 21 15:35:24.884665 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.884635 2569 topology_manager.go:138] "Creating topology manager with none policy"
Apr 21 15:35:24.884665 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.884644 2569 container_manager_linux.go:306] "Creating device plugin manager"
Apr 21 15:35:24.884665 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.884657 2569 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 15:35:24.885508 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.885497 2569 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 15:35:24.886255 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.886244 2569 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 15:35:24.886368 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.886359 2569 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 21 15:35:24.888705 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.888696 2569 kubelet.go:491] "Attempting to sync node with API server"
Apr 21 15:35:24.888748 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.888712 2569 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 21 15:35:24.888748 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.888724 2569 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 21 15:35:24.888748 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.888733 2569 kubelet.go:397] "Adding apiserver pod source"
Apr 21 15:35:24.888748 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.888742 2569 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 21 15:35:24.889793 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.889782 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 15:35:24.889844 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.889803 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 15:35:24.893039 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.893020 2569 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 21 15:35:24.894359 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.894346 2569 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 21 15:35:24.896091 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.896079 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 21 15:35:24.896129 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.896107 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 21 15:35:24.896129 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.896113 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 21 15:35:24.896129 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.896119 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 21 15:35:24.896129 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.896124 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 21 15:35:24.896237 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.896135 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 21 15:35:24.896237 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.896142 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 21 15:35:24.896237 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.896147 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 21 15:35:24.896237 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.896154 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 21 15:35:24.896237 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.896159 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 21 15:35:24.896237 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.896171 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 21 15:35:24.896237 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.896180 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 21 15:35:24.897209 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.897197 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 21 15:35:24.897209 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.897209 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 21 15:35:24.899110 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.899085 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vrz47"
Apr 21 15:35:24.900924 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.900907 2569 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 21 15:35:24.900982 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.900973 2569 server.go:1295] "Started kubelet"
Apr 21 15:35:24.901105 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.901054 2569 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 21 15:35:24.901216 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.901118 2569 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 21 15:35:24.901216 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.901206 2569 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 21 15:35:24.901736 ip-10-0-128-232 systemd[1]: Started Kubernetes Kubelet.
Apr 21 15:35:24.902631 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.902566 2569 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-232.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 21 15:35:24.902631 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.902598 2569 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 21 15:35:24.903292 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:24.903269 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-232.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 21 15:35:24.903292 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:24.903271 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 21 15:35:24.908518 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.908498 2569 server.go:317] "Adding debug handlers to kubelet server"
Apr 21 15:35:24.909099 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.909079 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vrz47"
Apr 21 15:35:24.911696 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:24.910655 2569 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-232.ec2.internal.18a86933f6b490cd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-232.ec2.internal,UID:ip-10-0-128-232.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-128-232.ec2.internal,},FirstTimestamp:2026-04-21 15:35:24.900937933 +0000 UTC m=+0.472970867,LastTimestamp:2026-04-21 15:35:24.900937933 +0000 UTC m=+0.472970867,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-232.ec2.internal,}"
Apr 21 15:35:24.912606 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:24.912576 2569 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 21 15:35:24.912606 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.912588 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 21 15:35:24.913150 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.913136 2569 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 21 15:35:24.913810 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.913792 2569 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 21 15:35:24.913810 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.913793 2569 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 21 15:35:24.913951 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.913823 2569 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 21 15:35:24.913951 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.913850 2569 factory.go:55] Registering systemd factory
Apr 21 15:35:24.913951 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.913865 2569 factory.go:223] Registration of the systemd container factory successfully
Apr 21 15:35:24.913951 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.913929 2569 reconstruct.go:97] "Volume reconstruction finished"
Apr 21 15:35:24.913951 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.913938 2569 reconciler.go:26] "Reconciler: start to sync state"
Apr 21 15:35:24.914157 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.914078 2569 factory.go:153] Registering CRI-O factory
Apr 21 15:35:24.914157 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.914089 2569 factory.go:223] Registration of the crio container factory successfully
Apr 21 15:35:24.914157 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.914139 2569 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 21 15:35:24.914157 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.914154 2569 factory.go:103] Registering Raw factory
Apr 21 15:35:24.914351 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.914163 2569 manager.go:1196] Started watching for new ooms in manager
Apr 21 15:35:24.914437 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:24.914419 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-232.ec2.internal\" not found"
Apr 21 15:35:24.915124 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.914726 2569 manager.go:319] Starting recovery of all containers
Apr 21 15:35:24.925106 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.924941 2569 manager.go:324] Recovery completed
Apr 21 15:35:24.927637 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.927620 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 15:35:24.929889 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.929878 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 15:35:24.932103 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.932085 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-232.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 15:35:24.932163 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.932111 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-232.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 15:35:24.932163 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.932122 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-232.ec2.internal" event="NodeHasSufficientPID"
Apr 21 15:35:24.932637 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.932623 2569 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 21 15:35:24.932637 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.932634 2569 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 21 15:35:24.932761 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.932650 2569 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 15:35:24.934678 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.934667 2569 policy_none.go:49] "None policy: Start"
Apr 21 15:35:24.934724 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.934682 2569 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 21 15:35:24.934724 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.934692 2569 state_mem.go:35] "Initializing new in-memory state store"
Apr 21 15:35:24.935338 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:24.935321 2569 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-128-232.ec2.internal\" not found" node="ip-10-0-128-232.ec2.internal"
Apr 21 15:35:24.967115 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.967096 2569 manager.go:341] "Starting Device Plugin manager"
Apr 21 15:35:24.967236 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:24.967134 2569 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 21 15:35:24.967236 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.967148 2569 server.go:85] "Starting device plugin registration server"
Apr 21 15:35:24.967749 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.967694 2569 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 21 15:35:24.967826 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.967770 2569 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 21 15:35:24.968237 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.968001 2569 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 21 15:35:24.968237 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.968084 2569 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 21 15:35:24.968237 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:24.968093 2569 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 21 15:35:24.968937 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:24.968918 2569 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 21 15:35:24.969080 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:24.969069 2569 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-232.ec2.internal\" not found"
Apr 21 15:35:25.048053 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.047966 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 21 15:35:25.049241 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.049226 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 21 15:35:25.049327 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.049256 2569 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 21 15:35:25.049327 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.049281 2569 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 21 15:35:25.049327 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.049291 2569 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 21 15:35:25.049457 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:25.049335 2569 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 21 15:35:25.052509 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.052490 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 15:35:25.068473 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.068447 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 15:35:25.069310 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.069295 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-232.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 15:35:25.069378 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.069325 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-232.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 15:35:25.069378 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.069339 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-232.ec2.internal" event="NodeHasSufficientPID"
Apr 21 15:35:25.069378 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.069366 2569 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-232.ec2.internal"
Apr 21 15:35:25.077385 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.077370 2569 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-232.ec2.internal"
Apr 21 15:35:25.077453 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:25.077395 2569 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-232.ec2.internal\": node \"ip-10-0-128-232.ec2.internal\" not found"
Apr 21 15:35:25.093145 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:25.093115 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-232.ec2.internal\" not found"
Apr 21 15:35:25.149712 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.149681 2569 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-232.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-232.ec2.internal"]
Apr 21 15:35:25.149783 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.149776 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 15:35:25.151204 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.151190 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-232.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 15:35:25.151272 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.151217 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-232.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 15:35:25.151272 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.151227 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-232.ec2.internal" event="NodeHasSufficientPID"
Apr 21 15:35:25.152524 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.152512 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 15:35:25.152654 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.152640 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-232.ec2.internal"
Apr 21 15:35:25.152712 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.152666 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 15:35:25.153202 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.153188 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-232.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 15:35:25.153260 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.153192 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-232.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 15:35:25.153260 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.153228 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-232.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 15:35:25.153260 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.153238 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-232.ec2.internal" event="NodeHasSufficientPID"
Apr 21 15:35:25.153346 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.153209 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-232.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 15:35:25.153346 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.153282 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-232.ec2.internal" event="NodeHasSufficientPID"
Apr 21 15:35:25.154427 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.154408 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-232.ec2.internal"
Apr 21 15:35:25.154540 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.154431 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 15:35:25.155085 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.155070 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-232.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 15:35:25.155157 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.155093 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-232.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 15:35:25.155157 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.155103 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-232.ec2.internal" event="NodeHasSufficientPID"
Apr 21 15:35:25.173034 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:25.173017 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-232.ec2.internal\" not found" node="ip-10-0-128-232.ec2.internal"
Apr 21 15:35:25.176352 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:25.176337 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-232.ec2.internal\" not found" node="ip-10-0-128-232.ec2.internal"
Apr 21 15:35:25.194008 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:25.193987 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-232.ec2.internal\" not found"
Apr 21 15:35:25.294962 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:25.294926 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-232.ec2.internal\" not found"
Apr 21 15:35:25.315082 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.314995 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0a0b3ad87f77adee4d1e995d037c7e36-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-232.ec2.internal\" (UID: \"0a0b3ad87f77adee4d1e995d037c7e36\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-232.ec2.internal"
Apr 21 15:35:25.315082 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.315026 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0a0b3ad87f77adee4d1e995d037c7e36-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-232.ec2.internal\" (UID: \"0a0b3ad87f77adee4d1e995d037c7e36\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-232.ec2.internal"
Apr 21 15:35:25.315082 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.315045 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/fca69f1fb3124857e02406d9db421c8d-config\") pod \"kube-apiserver-proxy-ip-10-0-128-232.ec2.internal\" (UID: \"fca69f1fb3124857e02406d9db421c8d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-232.ec2.internal"
Apr 21 15:35:25.395893 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:25.395865 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-232.ec2.internal\" not found"
Apr 21 15:35:25.416200 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.416176 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0a0b3ad87f77adee4d1e995d037c7e36-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-232.ec2.internal\" (UID: \"0a0b3ad87f77adee4d1e995d037c7e36\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-232.ec2.internal"
Apr 21 15:35:25.416264 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.416206 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0a0b3ad87f77adee4d1e995d037c7e36-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-232.ec2.internal\" (UID: \"0a0b3ad87f77adee4d1e995d037c7e36\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-232.ec2.internal"
Apr 21 15:35:25.416264 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.416231 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/fca69f1fb3124857e02406d9db421c8d-config\") pod \"kube-apiserver-proxy-ip-10-0-128-232.ec2.internal\" (UID: \"fca69f1fb3124857e02406d9db421c8d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-232.ec2.internal"
Apr 21 15:35:25.416357 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.416336 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0a0b3ad87f77adee4d1e995d037c7e36-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-232.ec2.internal\" (UID: \"0a0b3ad87f77adee4d1e995d037c7e36\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-232.ec2.internal"
Apr 21 15:35:25.416405 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.416372 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/fca69f1fb3124857e02406d9db421c8d-config\") pod \"kube-apiserver-proxy-ip-10-0-128-232.ec2.internal\" (UID: \"fca69f1fb3124857e02406d9db421c8d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-232.ec2.internal"
Apr 21 15:35:25.416405 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.416350 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0a0b3ad87f77adee4d1e995d037c7e36-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-232.ec2.internal\" (UID: \"0a0b3ad87f77adee4d1e995d037c7e36\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-232.ec2.internal"
Apr 21 15:35:25.475362 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.475322 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-232.ec2.internal"
Apr 21 15:35:25.478867 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.478845 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-232.ec2.internal"
Apr 21 15:35:25.496399 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:25.496375 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-232.ec2.internal\" not found"
Apr 21 15:35:25.597028 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:25.596937 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-232.ec2.internal\" not found"
Apr 21 15:35:25.697491 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:25.697440 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-232.ec2.internal\" not found"
Apr 21 15:35:25.798176 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:25.798143 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-232.ec2.internal\" not found"
Apr 21 15:35:25.811319 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.811291 2569 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 15:35:25.811492 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.811455 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 15:35:25.811492 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.811455 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 15:35:25.899267 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:25.899188 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-232.ec2.internal\" not found"
Apr 21 15:35:25.910964 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.910924 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 15:30:24 +0000 UTC" deadline="2027-12-24 18:14:39.967903931 +0000 UTC"
Apr 21 15:35:25.910964 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.910961 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14690h39m14.056945988s"
Apr 21 15:35:25.913224 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.913207 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 21 15:35:25.925230 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.925203 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 15:35:25.947102 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.947081 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-mshnk"
Apr 21 15:35:25.958881 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:25.958855 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-mshnk"
Apr 21 15:35:25.999347 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:25.999315 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-232.ec2.internal\" not found"
Apr 21 15:35:26.033023 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:26.032990 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a0b3ad87f77adee4d1e995d037c7e36.slice/crio-7f12fa4a2a3a70a81e7559a5c554006bb3ab7727855e5ccbd583844f923373d3 WatchSource:0}: Error finding container 7f12fa4a2a3a70a81e7559a5c554006bb3ab7727855e5ccbd583844f923373d3: Status 404 returned error can't find the container with id 7f12fa4a2a3a70a81e7559a5c554006bb3ab7727855e5ccbd583844f923373d3
Apr 21 15:35:26.033350 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:26.033330 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfca69f1fb3124857e02406d9db421c8d.slice/crio-f865b4f1f2dc31120751664074d322b58b0958d546d178c62504ca8e794e2ba3 WatchSource:0}: Error finding container f865b4f1f2dc31120751664074d322b58b0958d546d178c62504ca8e794e2ba3: Status 404 returned error can't find the container with id f865b4f1f2dc31120751664074d322b58b0958d546d178c62504ca8e794e2ba3
Apr 21 15:35:26.037443 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.037427 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 15:35:26.052959 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.052908 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-232.ec2.internal" event={"ID":"fca69f1fb3124857e02406d9db421c8d","Type":"ContainerStarted","Data":"f865b4f1f2dc31120751664074d322b58b0958d546d178c62504ca8e794e2ba3"}
Apr 21 15:35:26.053914 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.053893 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-232.ec2.internal" event={"ID":"0a0b3ad87f77adee4d1e995d037c7e36","Type":"ContainerStarted","Data":"7f12fa4a2a3a70a81e7559a5c554006bb3ab7727855e5ccbd583844f923373d3"}
Apr 21 15:35:26.100094 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:26.100059 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-232.ec2.internal\" not found"
Apr 21 15:35:26.185463 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.185437 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 15:35:26.191786 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.191765 2569 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 15:35:26.213932 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.213904 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-232.ec2.internal"
Apr 21 15:35:26.225802 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.225777 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 15:35:26.226697 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.226684 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-232.ec2.internal"
Apr 21 15:35:26.233522 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.233507 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname,
which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 15:35:26.889574 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.889539 2569 apiserver.go:52] "Watching apiserver" Apr 21 15:35:26.896450 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.896415 2569 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 21 15:35:26.897767 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.897734 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-lp64c","openshift-network-diagnostics/network-check-target-ct78s","openshift-network-operator/iptables-alerter-84tg6","openshift-ovn-kubernetes/ovnkube-node-wkxqg","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkgzg","openshift-cluster-node-tuning-operator/tuned-m6tzn","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-232.ec2.internal","openshift-multus/multus-additional-cni-plugins-xwpr7","openshift-multus/multus-cx47v","kube-system/konnectivity-agent-6l9p8","kube-system/kube-apiserver-proxy-ip-10-0-128-232.ec2.internal","openshift-dns/node-resolver-9b9sq","openshift-image-registry/node-ca-jddzn"] Apr 21 15:35:26.900671 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.900353 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6l9p8" Apr 21 15:35:26.900671 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.900570 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ct78s" Apr 21 15:35:26.901013 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:26.900955 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ct78s" podUID="bbf400a9-66da-48f5-ba51-6ecd75c50fa2" Apr 21 15:35:26.903205 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.903180 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-84tg6" Apr 21 15:35:26.903332 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.903169 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-v64vk\"" Apr 21 15:35:26.904411 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.904391 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 21 15:35:26.905345 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.904780 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 21 15:35:26.905345 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.904806 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:26.905345 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.904882 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkgzg" Apr 21 15:35:26.906101 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.905881 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 21 15:35:26.906101 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.905904 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-8q28v\"" Apr 21 15:35:26.906534 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.906506 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-m6tzn" Apr 21 15:35:26.907205 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.907190 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 21 15:35:26.907496 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.907468 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 21 15:35:26.907673 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.907653 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xwpr7" Apr 21 15:35:26.908886 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.908816 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 21 15:35:26.908886 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.908824 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 21 15:35:26.909922 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.909045 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 21 15:35:26.909922 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.909079 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 21 15:35:26.909922 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.909195 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 21 15:35:26.909922 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.909330 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 21 15:35:26.909922 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.909346 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-sp8fc\"" Apr 21 15:35:26.909922 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.909397 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-5qjrv\"" Apr 21 15:35:26.909922 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.909430 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 21 15:35:26.909922 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.909558 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 21 15:35:26.909922 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.909627 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-mtn8s\"" Apr 21 15:35:26.909922 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.909658 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 21 15:35:26.909922 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.909677 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 21 15:35:26.909922 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.909757 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 21 15:35:26.910529 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.910158 2569 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-multus/multus-cx47v" Apr 21 15:35:26.910529 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.910278 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lp64c" Apr 21 15:35:26.910529 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:26.910333 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lp64c" podUID="893ee07d-ac5e-4593-93fd-80655b690072" Apr 21 15:35:26.910676 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.910562 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 21 15:35:26.910676 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.910627 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 21 15:35:26.910733 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.910702 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 21 15:35:26.910833 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.910819 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-j7cc2\"" Apr 21 15:35:26.910948 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.910933 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 21 15:35:26.911065 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.911046 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 21 15:35:26.912571 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.912546 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9b9sq" Apr 21 15:35:26.913035 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.913014 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 15:35:26.913169 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.913021 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-qxmnr\"" Apr 21 15:35:26.913657 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.913639 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jddzn" Apr 21 15:35:26.914584 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.914568 2569 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 15:35:26.914878 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.914847 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-4fz95\"" Apr 21 15:35:26.915028 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.915010 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 21 15:35:26.915281 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.915265 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 21 15:35:26.915821 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.915805 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-5ftvd\"" Apr 21 15:35:26.916693 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.916675 
2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 21 15:35:26.916825 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.916805 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 15:35:26.917089 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.917074 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 21 15:35:26.924381 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.924351 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-system-cni-dir\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v" Apr 21 15:35:26.924505 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.924390 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/893ee07d-ac5e-4593-93fd-80655b690072-metrics-certs\") pod \"network-metrics-daemon-lp64c\" (UID: \"893ee07d-ac5e-4593-93fd-80655b690072\") " pod="openshift-multus/network-metrics-daemon-lp64c" Apr 21 15:35:26.924505 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.924422 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-host-run-ovn-kubernetes\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:26.924505 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.924451 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-etc-sysctl-conf\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn" Apr 21 15:35:26.924505 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.924491 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/402ecfa4-798f-4e6f-9d15-5c6ef953439a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xwpr7\" (UID: \"402ecfa4-798f-4e6f-9d15-5c6ef953439a\") " pod="openshift-multus/multus-additional-cni-plugins-xwpr7" Apr 21 15:35:26.924732 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.924538 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-cnibin\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v" Apr 21 15:35:26.924732 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.924579 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-host-run-multus-certs\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v" Apr 21 15:35:26.924732 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.924608 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdmr7\" (UniqueName: \"kubernetes.io/projected/50b93a7e-ace3-4153-b2d3-ea527a654b34-kube-api-access-zdmr7\") pod \"node-resolver-9b9sq\" (UID: \"50b93a7e-ace3-4153-b2d3-ea527a654b34\") " pod="openshift-dns/node-resolver-9b9sq" Apr 21 
15:35:26.924732 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.924632 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/822fd92b-9cc2-44e7-972a-9b68cde8ab9a-konnectivity-ca\") pod \"konnectivity-agent-6l9p8\" (UID: \"822fd92b-9cc2-44e7-972a-9b68cde8ab9a\") " pod="kube-system/konnectivity-agent-6l9p8" Apr 21 15:35:26.924732 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.924655 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-run-ovn\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:26.924732 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.924681 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-etc-sysctl-d\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn" Apr 21 15:35:26.925017 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.924705 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c2d06a3a-0637-4a19-b2ba-af896d234845-cni-binary-copy\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v" Apr 21 15:35:26.925017 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.924790 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-etc-tuned\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") 
" pod="openshift-cluster-node-tuning-operator/tuned-m6tzn" Apr 21 15:35:26.925017 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.924817 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbhq9\" (UniqueName: \"kubernetes.io/projected/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-kube-api-access-dbhq9\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn" Apr 21 15:35:26.925017 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.924855 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/402ecfa4-798f-4e6f-9d15-5c6ef953439a-system-cni-dir\") pod \"multus-additional-cni-plugins-xwpr7\" (UID: \"402ecfa4-798f-4e6f-9d15-5c6ef953439a\") " pod="openshift-multus/multus-additional-cni-plugins-xwpr7" Apr 21 15:35:26.925017 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.924898 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/de4d7367-97b7-475a-b70f-d1b1f47d5106-serviceca\") pod \"node-ca-jddzn\" (UID: \"de4d7367-97b7-475a-b70f-d1b1f47d5106\") " pod="openshift-image-registry/node-ca-jddzn" Apr 21 15:35:26.925017 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.924940 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-run-systemd\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:26.925017 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.924973 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-var-lib-openvswitch\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:26.925017 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.925002 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-etc-openvswitch\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:26.925410 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.925025 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/125effdd-7036-4d2f-ae23-d0516355b243-sys-fs\") pod \"aws-ebs-csi-driver-node-vkgzg\" (UID: \"125effdd-7036-4d2f-ae23-d0516355b243\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkgzg" Apr 21 15:35:26.925410 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.925052 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-systemd-units\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:26.925410 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.925074 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnsxg\" (UniqueName: \"kubernetes.io/projected/125effdd-7036-4d2f-ae23-d0516355b243-kube-api-access-xnsxg\") pod \"aws-ebs-csi-driver-node-vkgzg\" (UID: \"125effdd-7036-4d2f-ae23-d0516355b243\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkgzg" Apr 21 15:35:26.925410 ip-10-0-128-232 
kubenswrapper[2569]: I0421 15:35:26.925097 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-etc-sysconfig\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn" Apr 21 15:35:26.925410 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.925121 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/402ecfa4-798f-4e6f-9d15-5c6ef953439a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xwpr7\" (UID: \"402ecfa4-798f-4e6f-9d15-5c6ef953439a\") " pod="openshift-multus/multus-additional-cni-plugins-xwpr7" Apr 21 15:35:26.925410 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.925146 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-multus-cni-dir\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v" Apr 21 15:35:26.925410 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.925184 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-multus-socket-dir-parent\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v" Apr 21 15:35:26.925410 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.925207 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-host-run-netns\") pod \"multus-cx47v\" (UID: 
\"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v" Apr 21 15:35:26.925410 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.925233 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-multus-conf-dir\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v" Apr 21 15:35:26.925410 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.925259 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/50b93a7e-ace3-4153-b2d3-ea527a654b34-hosts-file\") pod \"node-resolver-9b9sq\" (UID: \"50b93a7e-ace3-4153-b2d3-ea527a654b34\") " pod="openshift-dns/node-resolver-9b9sq" Apr 21 15:35:26.925410 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.925306 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-node-log\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:26.925410 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.925342 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-log-socket\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:26.925410 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.925369 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2jzd\" (UniqueName: 
\"kubernetes.io/projected/bbf400a9-66da-48f5-ba51-6ecd75c50fa2-kube-api-access-v2jzd\") pod \"network-check-target-ct78s\" (UID: \"bbf400a9-66da-48f5-ba51-6ecd75c50fa2\") " pod="openshift-network-diagnostics/network-check-target-ct78s" Apr 21 15:35:26.925410 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.925406 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/50b93a7e-ace3-4153-b2d3-ea527a654b34-tmp-dir\") pod \"node-resolver-9b9sq\" (UID: \"50b93a7e-ace3-4153-b2d3-ea527a654b34\") " pod="openshift-dns/node-resolver-9b9sq" Apr 21 15:35:26.926121 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.925429 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/125effdd-7036-4d2f-ae23-d0516355b243-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vkgzg\" (UID: \"125effdd-7036-4d2f-ae23-d0516355b243\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkgzg" Apr 21 15:35:26.926121 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.925451 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-etc-systemd\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn" Apr 21 15:35:26.926121 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.925475 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-etc-kubernetes\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v" Apr 21 15:35:26.926121 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.925523 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/62ea1e4f-5379-4cf2-88a7-8b037ce9dd1d-iptables-alerter-script\") pod \"iptables-alerter-84tg6\" (UID: \"62ea1e4f-5379-4cf2-88a7-8b037ce9dd1d\") " pod="openshift-network-operator/iptables-alerter-84tg6" Apr 21 15:35:26.926121 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.925544 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/62ea1e4f-5379-4cf2-88a7-8b037ce9dd1d-host-slash\") pod \"iptables-alerter-84tg6\" (UID: \"62ea1e4f-5379-4cf2-88a7-8b037ce9dd1d\") " pod="openshift-network-operator/iptables-alerter-84tg6" Apr 21 15:35:26.926121 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.925570 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/822fd92b-9cc2-44e7-972a-9b68cde8ab9a-agent-certs\") pod \"konnectivity-agent-6l9p8\" (UID: \"822fd92b-9cc2-44e7-972a-9b68cde8ab9a\") " pod="kube-system/konnectivity-agent-6l9p8" Apr 21 15:35:26.926121 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.925596 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-host-cni-netd\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:26.926121 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.925619 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wkxqg\" (UID: 
\"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:26.926121 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.925644 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-etc-modprobe-d\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn" Apr 21 15:35:26.926121 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.925671 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-sys\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn" Apr 21 15:35:26.926121 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.925694 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-lib-modules\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn" Apr 21 15:35:26.926121 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.925716 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-tmp\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn" Apr 21 15:35:26.926121 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.925758 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/402ecfa4-798f-4e6f-9d15-5c6ef953439a-cni-binary-copy\") pod \"multus-additional-cni-plugins-xwpr7\" (UID: \"402ecfa4-798f-4e6f-9d15-5c6ef953439a\") " pod="openshift-multus/multus-additional-cni-plugins-xwpr7" Apr 21 15:35:26.926121 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.925788 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-host-run-netns\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:26.926121 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.925854 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/125effdd-7036-4d2f-ae23-d0516355b243-registration-dir\") pod \"aws-ebs-csi-driver-node-vkgzg\" (UID: \"125effdd-7036-4d2f-ae23-d0516355b243\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkgzg" Apr 21 15:35:26.926121 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.925902 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74jjq\" (UniqueName: \"kubernetes.io/projected/402ecfa4-798f-4e6f-9d15-5c6ef953439a-kube-api-access-74jjq\") pod \"multus-additional-cni-plugins-xwpr7\" (UID: \"402ecfa4-798f-4e6f-9d15-5c6ef953439a\") " pod="openshift-multus/multus-additional-cni-plugins-xwpr7" Apr 21 15:35:26.926936 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.925941 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-os-release\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v" Apr 21 15:35:26.926936 
ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.925976 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-host-var-lib-cni-bin\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v" Apr 21 15:35:26.926936 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.926007 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de4d7367-97b7-475a-b70f-d1b1f47d5106-host\") pod \"node-ca-jddzn\" (UID: \"de4d7367-97b7-475a-b70f-d1b1f47d5106\") " pod="openshift-image-registry/node-ca-jddzn" Apr 21 15:35:26.926936 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.926037 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-hostroot\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v" Apr 21 15:35:26.926936 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.926063 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-run-openvswitch\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:26.926936 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.926104 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-host\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn" 
Apr 21 15:35:26.926936 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.926126 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-host-cni-bin\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:26.926936 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.926155 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4z5n\" (UniqueName: \"kubernetes.io/projected/7b0083d8-b152-40fa-9a89-e3180ed1747d-kube-api-access-q4z5n\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:26.926936 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.926178 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-run\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn" Apr 21 15:35:26.926936 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.926200 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-host-run-k8s-cni-cncf-io\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v" Apr 21 15:35:26.926936 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.926223 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-host-var-lib-kubelet\") pod 
\"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v" Apr 21 15:35:26.926936 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.926245 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c2d06a3a-0637-4a19-b2ba-af896d234845-multus-daemon-config\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v" Apr 21 15:35:26.926936 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.926269 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4bvk\" (UniqueName: \"kubernetes.io/projected/893ee07d-ac5e-4593-93fd-80655b690072-kube-api-access-x4bvk\") pod \"network-metrics-daemon-lp64c\" (UID: \"893ee07d-ac5e-4593-93fd-80655b690072\") " pod="openshift-multus/network-metrics-daemon-lp64c" Apr 21 15:35:26.926936 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.926292 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-host-slash\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:26.926936 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.926316 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b0083d8-b152-40fa-9a89-e3180ed1747d-env-overrides\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:26.926936 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.926340 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" 
(UniqueName: \"kubernetes.io/host-path/125effdd-7036-4d2f-ae23-d0516355b243-socket-dir\") pod \"aws-ebs-csi-driver-node-vkgzg\" (UID: \"125effdd-7036-4d2f-ae23-d0516355b243\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkgzg" Apr 21 15:35:26.926936 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.926369 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/125effdd-7036-4d2f-ae23-d0516355b243-etc-selinux\") pod \"aws-ebs-csi-driver-node-vkgzg\" (UID: \"125effdd-7036-4d2f-ae23-d0516355b243\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkgzg" Apr 21 15:35:26.927771 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.926393 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-etc-kubernetes\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn" Apr 21 15:35:26.927771 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.926418 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb5qn\" (UniqueName: \"kubernetes.io/projected/c2d06a3a-0637-4a19-b2ba-af896d234845-kube-api-access-lb5qn\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v" Apr 21 15:35:26.927771 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.926442 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp6x5\" (UniqueName: \"kubernetes.io/projected/62ea1e4f-5379-4cf2-88a7-8b037ce9dd1d-kube-api-access-pp6x5\") pod \"iptables-alerter-84tg6\" (UID: \"62ea1e4f-5379-4cf2-88a7-8b037ce9dd1d\") " pod="openshift-network-operator/iptables-alerter-84tg6" Apr 21 15:35:26.927771 
ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.926466 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb47f\" (UniqueName: \"kubernetes.io/projected/de4d7367-97b7-475a-b70f-d1b1f47d5106-kube-api-access-qb47f\") pod \"node-ca-jddzn\" (UID: \"de4d7367-97b7-475a-b70f-d1b1f47d5106\") " pod="openshift-image-registry/node-ca-jddzn" Apr 21 15:35:26.927771 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.926517 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/402ecfa4-798f-4e6f-9d15-5c6ef953439a-cnibin\") pod \"multus-additional-cni-plugins-xwpr7\" (UID: \"402ecfa4-798f-4e6f-9d15-5c6ef953439a\") " pod="openshift-multus/multus-additional-cni-plugins-xwpr7" Apr 21 15:35:26.927771 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.926541 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b0083d8-b152-40fa-9a89-e3180ed1747d-ovnkube-config\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:26.927771 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.926567 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-host-kubelet\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:26.927771 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.926591 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b0083d8-b152-40fa-9a89-e3180ed1747d-ovn-node-metrics-cert\") pod 
\"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:26.927771 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.926639 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7b0083d8-b152-40fa-9a89-e3180ed1747d-ovnkube-script-lib\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:26.927771 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.926664 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/125effdd-7036-4d2f-ae23-d0516355b243-device-dir\") pod \"aws-ebs-csi-driver-node-vkgzg\" (UID: \"125effdd-7036-4d2f-ae23-d0516355b243\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkgzg" Apr 21 15:35:26.927771 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.926692 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-var-lib-kubelet\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn" Apr 21 15:35:26.927771 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.926716 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/402ecfa4-798f-4e6f-9d15-5c6ef953439a-os-release\") pod \"multus-additional-cni-plugins-xwpr7\" (UID: \"402ecfa4-798f-4e6f-9d15-5c6ef953439a\") " pod="openshift-multus/multus-additional-cni-plugins-xwpr7" Apr 21 15:35:26.927771 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.926886 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/402ecfa4-798f-4e6f-9d15-5c6ef953439a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xwpr7\" (UID: \"402ecfa4-798f-4e6f-9d15-5c6ef953439a\") " pod="openshift-multus/multus-additional-cni-plugins-xwpr7" Apr 21 15:35:26.927771 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.926916 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-host-var-lib-cni-multus\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v" Apr 21 15:35:26.959689 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.959657 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 15:30:25 +0000 UTC" deadline="2028-01-27 01:23:41.945854386 +0000 UTC" Apr 21 15:35:26.959689 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:26.959687 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15489h48m14.986170497s" Apr 21 15:35:27.027697 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.027664 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-host-run-netns\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v" Apr 21 15:35:27.027697 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.027696 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-multus-conf-dir\") pod \"multus-cx47v\" (UID: 
\"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v" Apr 21 15:35:27.027934 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.027720 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/50b93a7e-ace3-4153-b2d3-ea527a654b34-hosts-file\") pod \"node-resolver-9b9sq\" (UID: \"50b93a7e-ace3-4153-b2d3-ea527a654b34\") " pod="openshift-dns/node-resolver-9b9sq" Apr 21 15:35:27.027934 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.027746 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-node-log\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:27.027934 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.027756 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-host-run-netns\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v" Apr 21 15:35:27.027934 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.027770 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-log-socket\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:27.027934 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.027770 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-multus-conf-dir\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " 
pod="openshift-multus/multus-cx47v" Apr 21 15:35:27.027934 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.027803 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-log-socket\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:27.027934 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.027805 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2jzd\" (UniqueName: \"kubernetes.io/projected/bbf400a9-66da-48f5-ba51-6ecd75c50fa2-kube-api-access-v2jzd\") pod \"network-check-target-ct78s\" (UID: \"bbf400a9-66da-48f5-ba51-6ecd75c50fa2\") " pod="openshift-network-diagnostics/network-check-target-ct78s" Apr 21 15:35:27.027934 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.027834 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-node-log\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:27.027934 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.027836 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/50b93a7e-ace3-4153-b2d3-ea527a654b34-tmp-dir\") pod \"node-resolver-9b9sq\" (UID: \"50b93a7e-ace3-4153-b2d3-ea527a654b34\") " pod="openshift-dns/node-resolver-9b9sq" Apr 21 15:35:27.027934 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.027900 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/50b93a7e-ace3-4153-b2d3-ea527a654b34-hosts-file\") pod \"node-resolver-9b9sq\" (UID: \"50b93a7e-ace3-4153-b2d3-ea527a654b34\") " 
pod="openshift-dns/node-resolver-9b9sq" Apr 21 15:35:27.027934 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.027902 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/125effdd-7036-4d2f-ae23-d0516355b243-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vkgzg\" (UID: \"125effdd-7036-4d2f-ae23-d0516355b243\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkgzg" Apr 21 15:35:27.028427 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.027950 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/125effdd-7036-4d2f-ae23-d0516355b243-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vkgzg\" (UID: \"125effdd-7036-4d2f-ae23-d0516355b243\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkgzg" Apr 21 15:35:27.028427 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.027954 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-etc-systemd\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn" Apr 21 15:35:27.028427 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.027977 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-etc-kubernetes\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v" Apr 21 15:35:27.028427 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.027992 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/62ea1e4f-5379-4cf2-88a7-8b037ce9dd1d-iptables-alerter-script\") pod \"iptables-alerter-84tg6\" (UID: 
\"62ea1e4f-5379-4cf2-88a7-8b037ce9dd1d\") " pod="openshift-network-operator/iptables-alerter-84tg6" Apr 21 15:35:27.028427 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028010 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/62ea1e4f-5379-4cf2-88a7-8b037ce9dd1d-host-slash\") pod \"iptables-alerter-84tg6\" (UID: \"62ea1e4f-5379-4cf2-88a7-8b037ce9dd1d\") " pod="openshift-network-operator/iptables-alerter-84tg6" Apr 21 15:35:27.028427 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028036 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/822fd92b-9cc2-44e7-972a-9b68cde8ab9a-agent-certs\") pod \"konnectivity-agent-6l9p8\" (UID: \"822fd92b-9cc2-44e7-972a-9b68cde8ab9a\") " pod="kube-system/konnectivity-agent-6l9p8" Apr 21 15:35:27.028427 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028056 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-etc-kubernetes\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v" Apr 21 15:35:27.028427 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028058 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-host-cni-netd\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:27.028427 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028095 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-host-cni-netd\") pod \"ovnkube-node-wkxqg\" (UID: 
\"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:27.028427 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028135 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:27.028427 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028165 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-etc-modprobe-d\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn" Apr 21 15:35:27.028427 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028167 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/50b93a7e-ace3-4153-b2d3-ea527a654b34-tmp-dir\") pod \"node-resolver-9b9sq\" (UID: \"50b93a7e-ace3-4153-b2d3-ea527a654b34\") " pod="openshift-dns/node-resolver-9b9sq" Apr 21 15:35:27.028427 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028187 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-sys\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn" Apr 21 15:35:27.028427 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028189 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:27.028427 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028214 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-lib-modules\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn" Apr 21 15:35:27.028427 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028249 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-etc-systemd\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn" Apr 21 15:35:27.028427 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028305 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-etc-modprobe-d\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn" Apr 21 15:35:27.029191 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028334 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-sys\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn" Apr 21 15:35:27.029191 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028350 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/62ea1e4f-5379-4cf2-88a7-8b037ce9dd1d-host-slash\") pod \"iptables-alerter-84tg6\" (UID: \"62ea1e4f-5379-4cf2-88a7-8b037ce9dd1d\") " pod="openshift-network-operator/iptables-alerter-84tg6"
Apr 21 15:35:27.029191 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028359 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-tmp\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn"
Apr 21 15:35:27.029191 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028387 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/402ecfa4-798f-4e6f-9d15-5c6ef953439a-cni-binary-copy\") pod \"multus-additional-cni-plugins-xwpr7\" (UID: \"402ecfa4-798f-4e6f-9d15-5c6ef953439a\") " pod="openshift-multus/multus-additional-cni-plugins-xwpr7"
Apr 21 15:35:27.029191 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028411 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-host-run-netns\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg"
Apr 21 15:35:27.029191 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028435 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/125effdd-7036-4d2f-ae23-d0516355b243-registration-dir\") pod \"aws-ebs-csi-driver-node-vkgzg\" (UID: \"125effdd-7036-4d2f-ae23-d0516355b243\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkgzg"
Apr 21 15:35:27.029191 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028462 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-74jjq\" (UniqueName: \"kubernetes.io/projected/402ecfa4-798f-4e6f-9d15-5c6ef953439a-kube-api-access-74jjq\") pod \"multus-additional-cni-plugins-xwpr7\" (UID: \"402ecfa4-798f-4e6f-9d15-5c6ef953439a\") " pod="openshift-multus/multus-additional-cni-plugins-xwpr7"
Apr 21 15:35:27.029191 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028502 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-os-release\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v"
Apr 21 15:35:27.029191 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028509 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-host-run-netns\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg"
Apr 21 15:35:27.029191 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028458 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-lib-modules\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn"
Apr 21 15:35:27.029191 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028530 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-host-var-lib-cni-bin\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v"
Apr 21 15:35:27.029191 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028557 2569 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 21 15:35:27.029191 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028566 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-host-var-lib-cni-bin\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v"
Apr 21 15:35:27.029191 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028572 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/62ea1e4f-5379-4cf2-88a7-8b037ce9dd1d-iptables-alerter-script\") pod \"iptables-alerter-84tg6\" (UID: \"62ea1e4f-5379-4cf2-88a7-8b037ce9dd1d\") " pod="openshift-network-operator/iptables-alerter-84tg6"
Apr 21 15:35:27.029191 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028563 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de4d7367-97b7-475a-b70f-d1b1f47d5106-host\") pod \"node-ca-jddzn\" (UID: \"de4d7367-97b7-475a-b70f-d1b1f47d5106\") " pod="openshift-image-registry/node-ca-jddzn"
Apr 21 15:35:27.029191 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028593 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de4d7367-97b7-475a-b70f-d1b1f47d5106-host\") pod \"node-ca-jddzn\" (UID: \"de4d7367-97b7-475a-b70f-d1b1f47d5106\") " pod="openshift-image-registry/node-ca-jddzn"
Apr 21 15:35:27.029191 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028613 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-hostroot\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v"
Apr 21 15:35:27.029191 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028646 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-hostroot\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v"
Apr 21 15:35:27.029952 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028651 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-os-release\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v"
Apr 21 15:35:27.029952 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028577 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/125effdd-7036-4d2f-ae23-d0516355b243-registration-dir\") pod \"aws-ebs-csi-driver-node-vkgzg\" (UID: \"125effdd-7036-4d2f-ae23-d0516355b243\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkgzg"
Apr 21 15:35:27.029952 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028652 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-run-openvswitch\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg"
Apr 21 15:35:27.029952 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028690 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-run-openvswitch\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg"
Apr 21 15:35:27.029952 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028692 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-host\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn"
Apr 21 15:35:27.029952 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028724 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-host\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn"
Apr 21 15:35:27.029952 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028729 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-host-cni-bin\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg"
Apr 21 15:35:27.029952 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028761 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4z5n\" (UniqueName: \"kubernetes.io/projected/7b0083d8-b152-40fa-9a89-e3180ed1747d-kube-api-access-q4z5n\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg"
Apr 21 15:35:27.029952 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028775 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-host-cni-bin\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg"
Apr 21 15:35:27.029952 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028791 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-run\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn"
Apr 21 15:35:27.029952 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028818 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-host-run-k8s-cni-cncf-io\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v"
Apr 21 15:35:27.029952 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028844 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-host-var-lib-kubelet\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v"
Apr 21 15:35:27.029952 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028849 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-run\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn"
Apr 21 15:35:27.029952 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028868 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c2d06a3a-0637-4a19-b2ba-af896d234845-multus-daemon-config\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v"
Apr 21 15:35:27.029952 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028893 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-host-run-k8s-cni-cncf-io\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v"
Apr 21 15:35:27.029952 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028896 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x4bvk\" (UniqueName: \"kubernetes.io/projected/893ee07d-ac5e-4593-93fd-80655b690072-kube-api-access-x4bvk\") pod \"network-metrics-daemon-lp64c\" (UID: \"893ee07d-ac5e-4593-93fd-80655b690072\") " pod="openshift-multus/network-metrics-daemon-lp64c"
Apr 21 15:35:27.029952 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028906 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/402ecfa4-798f-4e6f-9d15-5c6ef953439a-cni-binary-copy\") pod \"multus-additional-cni-plugins-xwpr7\" (UID: \"402ecfa4-798f-4e6f-9d15-5c6ef953439a\") " pod="openshift-multus/multus-additional-cni-plugins-xwpr7"
Apr 21 15:35:27.029952 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028924 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-host-var-lib-kubelet\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v"
Apr 21 15:35:27.030805 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028928 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-host-slash\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg"
Apr 21 15:35:27.030805 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028957 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-host-slash\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg"
Apr 21 15:35:27.030805 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028967 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b0083d8-b152-40fa-9a89-e3180ed1747d-env-overrides\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg"
Apr 21 15:35:27.030805 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.028993 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/125effdd-7036-4d2f-ae23-d0516355b243-socket-dir\") pod \"aws-ebs-csi-driver-node-vkgzg\" (UID: \"125effdd-7036-4d2f-ae23-d0516355b243\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkgzg"
Apr 21 15:35:27.030805 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029018 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/125effdd-7036-4d2f-ae23-d0516355b243-etc-selinux\") pod \"aws-ebs-csi-driver-node-vkgzg\" (UID: \"125effdd-7036-4d2f-ae23-d0516355b243\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkgzg"
Apr 21 15:35:27.030805 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029043 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-etc-kubernetes\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn"
Apr 21 15:35:27.030805 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029068 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lb5qn\" (UniqueName: \"kubernetes.io/projected/c2d06a3a-0637-4a19-b2ba-af896d234845-kube-api-access-lb5qn\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v"
Apr 21 15:35:27.030805 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029093 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pp6x5\" (UniqueName: \"kubernetes.io/projected/62ea1e4f-5379-4cf2-88a7-8b037ce9dd1d-kube-api-access-pp6x5\") pod \"iptables-alerter-84tg6\" (UID: \"62ea1e4f-5379-4cf2-88a7-8b037ce9dd1d\") " pod="openshift-network-operator/iptables-alerter-84tg6"
Apr 21 15:35:27.030805 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029115 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qb47f\" (UniqueName: \"kubernetes.io/projected/de4d7367-97b7-475a-b70f-d1b1f47d5106-kube-api-access-qb47f\") pod \"node-ca-jddzn\" (UID: \"de4d7367-97b7-475a-b70f-d1b1f47d5106\") " pod="openshift-image-registry/node-ca-jddzn"
Apr 21 15:35:27.030805 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029139 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/402ecfa4-798f-4e6f-9d15-5c6ef953439a-cnibin\") pod \"multus-additional-cni-plugins-xwpr7\" (UID: \"402ecfa4-798f-4e6f-9d15-5c6ef953439a\") " pod="openshift-multus/multus-additional-cni-plugins-xwpr7"
Apr 21 15:35:27.030805 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029163 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b0083d8-b152-40fa-9a89-e3180ed1747d-ovnkube-config\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg"
Apr 21 15:35:27.030805 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029168 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/125effdd-7036-4d2f-ae23-d0516355b243-socket-dir\") pod \"aws-ebs-csi-driver-node-vkgzg\" (UID: \"125effdd-7036-4d2f-ae23-d0516355b243\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkgzg"
Apr 21 15:35:27.030805 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029187 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-host-kubelet\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg"
Apr 21 15:35:27.030805 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029211 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b0083d8-b152-40fa-9a89-e3180ed1747d-ovn-node-metrics-cert\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg"
Apr 21 15:35:27.030805 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029236 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7b0083d8-b152-40fa-9a89-e3180ed1747d-ovnkube-script-lib\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg"
Apr 21 15:35:27.030805 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029258 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/125effdd-7036-4d2f-ae23-d0516355b243-device-dir\") pod \"aws-ebs-csi-driver-node-vkgzg\" (UID: \"125effdd-7036-4d2f-ae23-d0516355b243\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkgzg"
Apr 21 15:35:27.030805 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029282 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-var-lib-kubelet\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn"
Apr 21 15:35:27.031580 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029306 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/402ecfa4-798f-4e6f-9d15-5c6ef953439a-os-release\") pod \"multus-additional-cni-plugins-xwpr7\" (UID: \"402ecfa4-798f-4e6f-9d15-5c6ef953439a\") " pod="openshift-multus/multus-additional-cni-plugins-xwpr7"
Apr 21 15:35:27.031580 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029367 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/125effdd-7036-4d2f-ae23-d0516355b243-device-dir\") pod \"aws-ebs-csi-driver-node-vkgzg\" (UID: \"125effdd-7036-4d2f-ae23-d0516355b243\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkgzg"
Apr 21 15:35:27.031580 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029409 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-var-lib-kubelet\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn"
Apr 21 15:35:27.031580 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029433 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c2d06a3a-0637-4a19-b2ba-af896d234845-multus-daemon-config\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v"
Apr 21 15:35:27.031580 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029505 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-host-kubelet\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg"
Apr 21 15:35:27.031580 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029546 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/402ecfa4-798f-4e6f-9d15-5c6ef953439a-cnibin\") pod \"multus-additional-cni-plugins-xwpr7\" (UID: \"402ecfa4-798f-4e6f-9d15-5c6ef953439a\") " pod="openshift-multus/multus-additional-cni-plugins-xwpr7"
Apr 21 15:35:27.031580 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029555 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/125effdd-7036-4d2f-ae23-d0516355b243-etc-selinux\") pod \"aws-ebs-csi-driver-node-vkgzg\" (UID: \"125effdd-7036-4d2f-ae23-d0516355b243\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkgzg"
Apr 21 15:35:27.031580 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029576 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/402ecfa4-798f-4e6f-9d15-5c6ef953439a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xwpr7\" (UID: \"402ecfa4-798f-4e6f-9d15-5c6ef953439a\") " pod="openshift-multus/multus-additional-cni-plugins-xwpr7"
Apr 21 15:35:27.031580 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029603 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-host-var-lib-cni-multus\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v"
Apr 21 15:35:27.031580 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029610 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-etc-kubernetes\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn"
Apr 21 15:35:27.031580 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029618 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b0083d8-b152-40fa-9a89-e3180ed1747d-env-overrides\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg"
Apr 21 15:35:27.031580 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029630 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-system-cni-dir\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v"
Apr 21 15:35:27.031580 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029669 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/402ecfa4-798f-4e6f-9d15-5c6ef953439a-os-release\") pod \"multus-additional-cni-plugins-xwpr7\" (UID: \"402ecfa4-798f-4e6f-9d15-5c6ef953439a\") " pod="openshift-multus/multus-additional-cni-plugins-xwpr7"
Apr 21 15:35:27.031580 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029680 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-system-cni-dir\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v"
Apr 21 15:35:27.031580 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029719 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-host-var-lib-cni-multus\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v"
Apr 21 15:35:27.031580 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029766 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/893ee07d-ac5e-4593-93fd-80655b690072-metrics-certs\") pod \"network-metrics-daemon-lp64c\" (UID: \"893ee07d-ac5e-4593-93fd-80655b690072\") " pod="openshift-multus/network-metrics-daemon-lp64c"
Apr 21 15:35:27.031580 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029797 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-host-run-ovn-kubernetes\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg"
Apr 21 15:35:27.032300 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029823 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-etc-sysctl-conf\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn"
Apr 21 15:35:27.032300 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029851 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/402ecfa4-798f-4e6f-9d15-5c6ef953439a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xwpr7\" (UID: \"402ecfa4-798f-4e6f-9d15-5c6ef953439a\") " pod="openshift-multus/multus-additional-cni-plugins-xwpr7"
Apr 21 15:35:27.032300 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:27.029871 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:35:27.032300 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029879 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-cnibin\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v"
Apr 21 15:35:27.032300 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029903 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-host-run-multus-certs\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v"
Apr 21 15:35:27.032300 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.029930 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdmr7\" (UniqueName: \"kubernetes.io/projected/50b93a7e-ace3-4153-b2d3-ea527a654b34-kube-api-access-zdmr7\") pod \"node-resolver-9b9sq\" (UID: \"50b93a7e-ace3-4153-b2d3-ea527a654b34\") " pod="openshift-dns/node-resolver-9b9sq"
Apr 21 15:35:27.032300 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:27.029991 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/893ee07d-ac5e-4593-93fd-80655b690072-metrics-certs podName:893ee07d-ac5e-4593-93fd-80655b690072 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:27.529923588 +0000 UTC m=+3.101956515 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/893ee07d-ac5e-4593-93fd-80655b690072-metrics-certs") pod "network-metrics-daemon-lp64c" (UID: "893ee07d-ac5e-4593-93fd-80655b690072") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:35:27.032300 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.030024 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/822fd92b-9cc2-44e7-972a-9b68cde8ab9a-konnectivity-ca\") pod \"konnectivity-agent-6l9p8\" (UID: \"822fd92b-9cc2-44e7-972a-9b68cde8ab9a\") " pod="kube-system/konnectivity-agent-6l9p8"
Apr 21 15:35:27.032300 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.030049 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-run-ovn\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg"
Apr 21 15:35:27.032300 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.030067 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/402ecfa4-798f-4e6f-9d15-5c6ef953439a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xwpr7\" (UID: \"402ecfa4-798f-4e6f-9d15-5c6ef953439a\") " pod="openshift-multus/multus-additional-cni-plugins-xwpr7"
Apr 21 15:35:27.032300 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.030076 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-etc-sysctl-d\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn"
Apr 21 15:35:27.032300 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.030103 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c2d06a3a-0637-4a19-b2ba-af896d234845-cni-binary-copy\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v"
Apr 21 15:35:27.032300 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.030127 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-etc-tuned\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn"
Apr 21 15:35:27.032300 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.030155 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-host-run-multus-certs\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v"
Apr 21 15:35:27.032300 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.030160 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbhq9\" (UniqueName: \"kubernetes.io/projected/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-kube-api-access-dbhq9\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn"
Apr 21 15:35:27.032300 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.030210 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/402ecfa4-798f-4e6f-9d15-5c6ef953439a-system-cni-dir\") pod \"multus-additional-cni-plugins-xwpr7\" (UID: \"402ecfa4-798f-4e6f-9d15-5c6ef953439a\") " pod="openshift-multus/multus-additional-cni-plugins-xwpr7"
Apr 21 15:35:27.032300 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.030272 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-cnibin\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v"
Apr 21 15:35:27.033109 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.030362 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/402ecfa4-798f-4e6f-9d15-5c6ef953439a-system-cni-dir\") pod \"multus-additional-cni-plugins-xwpr7\" (UID: \"402ecfa4-798f-4e6f-9d15-5c6ef953439a\") " pod="openshift-multus/multus-additional-cni-plugins-xwpr7"
Apr 21 15:35:27.033109 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.030397 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/de4d7367-97b7-475a-b70f-d1b1f47d5106-serviceca\") pod \"node-ca-jddzn\" (UID: \"de4d7367-97b7-475a-b70f-d1b1f47d5106\") " pod="openshift-image-registry/node-ca-jddzn"
Apr 21 15:35:27.033109 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.030427 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-run-systemd\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg"
Apr 21 15:35:27.033109 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.030452 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-var-lib-openvswitch\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg"
Apr 21 15:35:27.033109 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.030498 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-etc-openvswitch\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg"
Apr 21 15:35:27.033109 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.030523 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/125effdd-7036-4d2f-ae23-d0516355b243-sys-fs\") pod \"aws-ebs-csi-driver-node-vkgzg\" (UID: \"125effdd-7036-4d2f-ae23-d0516355b243\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkgzg"
Apr 21 15:35:27.033109 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.030553 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-systemd-units\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg"
Apr 21 15:35:27.033109 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.030617 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xnsxg\" (UniqueName: \"kubernetes.io/projected/125effdd-7036-4d2f-ae23-d0516355b243-kube-api-access-xnsxg\") pod \"aws-ebs-csi-driver-node-vkgzg\" (UID: \"125effdd-7036-4d2f-ae23-d0516355b243\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkgzg"
Apr 21 15:35:27.033109 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.030674 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-etc-sysconfig\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn"
Apr 21 15:35:27.033109 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.030729 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-etc-sysconfig\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn"
Apr 21 15:35:27.033109 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.030764 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-systemd-units\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg"
Apr 21 15:35:27.033109 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.030770 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/402ecfa4-798f-4e6f-9d15-5c6ef953439a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xwpr7\" (UID: \"402ecfa4-798f-4e6f-9d15-5c6ef953439a\") " pod="openshift-multus/multus-additional-cni-plugins-xwpr7"
Apr 21 15:35:27.033109 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.030811 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-run-systemd\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg"
Apr 21 15:35:27.033109
ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.030832 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-host-run-ovn-kubernetes\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:27.033109 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.030872 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-etc-openvswitch\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:27.033109 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.030905 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-var-lib-openvswitch\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:27.033109 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.030987 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-etc-sysctl-d\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn" Apr 21 15:35:27.033973 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.031032 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/125effdd-7036-4d2f-ae23-d0516355b243-sys-fs\") pod \"aws-ebs-csi-driver-node-vkgzg\" (UID: \"125effdd-7036-4d2f-ae23-d0516355b243\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkgzg" Apr 21 
15:35:27.033973 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.031171 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c2d06a3a-0637-4a19-b2ba-af896d234845-cni-binary-copy\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v" Apr 21 15:35:27.033973 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.031176 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/de4d7367-97b7-475a-b70f-d1b1f47d5106-serviceca\") pod \"node-ca-jddzn\" (UID: \"de4d7367-97b7-475a-b70f-d1b1f47d5106\") " pod="openshift-image-registry/node-ca-jddzn" Apr 21 15:35:27.033973 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.031213 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/402ecfa4-798f-4e6f-9d15-5c6ef953439a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xwpr7\" (UID: \"402ecfa4-798f-4e6f-9d15-5c6ef953439a\") " pod="openshift-multus/multus-additional-cni-plugins-xwpr7" Apr 21 15:35:27.033973 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.031272 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7b0083d8-b152-40fa-9a89-e3180ed1747d-run-ovn\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:27.033973 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.031282 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-etc-sysctl-conf\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn" Apr 21 15:35:27.033973 
ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.031303 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-multus-cni-dir\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v" Apr 21 15:35:27.033973 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.031353 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-multus-cni-dir\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v" Apr 21 15:35:27.033973 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.031404 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-multus-socket-dir-parent\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v" Apr 21 15:35:27.033973 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.031552 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/822fd92b-9cc2-44e7-972a-9b68cde8ab9a-konnectivity-ca\") pod \"konnectivity-agent-6l9p8\" (UID: \"822fd92b-9cc2-44e7-972a-9b68cde8ab9a\") " pod="kube-system/konnectivity-agent-6l9p8" Apr 21 15:35:27.033973 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.031566 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c2d06a3a-0637-4a19-b2ba-af896d234845-multus-socket-dir-parent\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v" Apr 21 15:35:27.033973 ip-10-0-128-232 kubenswrapper[2569]: I0421 
15:35:27.031965 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b0083d8-b152-40fa-9a89-e3180ed1747d-ovnkube-config\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:27.033973 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.032385 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7b0083d8-b152-40fa-9a89-e3180ed1747d-ovnkube-script-lib\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:27.033973 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.032632 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-tmp\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn" Apr 21 15:35:27.033973 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.032736 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b0083d8-b152-40fa-9a89-e3180ed1747d-ovn-node-metrics-cert\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:27.033973 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.032845 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/822fd92b-9cc2-44e7-972a-9b68cde8ab9a-agent-certs\") pod \"konnectivity-agent-6l9p8\" (UID: \"822fd92b-9cc2-44e7-972a-9b68cde8ab9a\") " pod="kube-system/konnectivity-agent-6l9p8" Apr 21 15:35:27.034898 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.034192 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-etc-tuned\") pod \"tuned-m6tzn\" (UID: \"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn" Apr 21 15:35:27.035266 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:27.035245 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:35:27.035266 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:27.035269 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:35:27.035421 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:27.035281 2569 projected.go:194] Error preparing data for projected volume kube-api-access-v2jzd for pod openshift-network-diagnostics/network-check-target-ct78s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:27.035421 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:27.035341 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bbf400a9-66da-48f5-ba51-6ecd75c50fa2-kube-api-access-v2jzd podName:bbf400a9-66da-48f5-ba51-6ecd75c50fa2 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:27.535324885 +0000 UTC m=+3.107357810 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-v2jzd" (UniqueName: "kubernetes.io/projected/bbf400a9-66da-48f5-ba51-6ecd75c50fa2-kube-api-access-v2jzd") pod "network-check-target-ct78s" (UID: "bbf400a9-66da-48f5-ba51-6ecd75c50fa2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:27.039874 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.039830 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/402ecfa4-798f-4e6f-9d15-5c6ef953439a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xwpr7\" (UID: \"402ecfa4-798f-4e6f-9d15-5c6ef953439a\") " pod="openshift-multus/multus-additional-cni-plugins-xwpr7" Apr 21 15:35:27.044304 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.044278 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp6x5\" (UniqueName: \"kubernetes.io/projected/62ea1e4f-5379-4cf2-88a7-8b037ce9dd1d-kube-api-access-pp6x5\") pod \"iptables-alerter-84tg6\" (UID: \"62ea1e4f-5379-4cf2-88a7-8b037ce9dd1d\") " pod="openshift-network-operator/iptables-alerter-84tg6" Apr 21 15:35:27.045539 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.045423 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74jjq\" (UniqueName: \"kubernetes.io/projected/402ecfa4-798f-4e6f-9d15-5c6ef953439a-kube-api-access-74jjq\") pod \"multus-additional-cni-plugins-xwpr7\" (UID: \"402ecfa4-798f-4e6f-9d15-5c6ef953439a\") " pod="openshift-multus/multus-additional-cni-plugins-xwpr7" Apr 21 15:35:27.045916 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.045892 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbhq9\" (UniqueName: \"kubernetes.io/projected/3e9c36e6-7be3-4c71-a7e3-e34aa989258a-kube-api-access-dbhq9\") pod \"tuned-m6tzn\" (UID: 
\"3e9c36e6-7be3-4c71-a7e3-e34aa989258a\") " pod="openshift-cluster-node-tuning-operator/tuned-m6tzn" Apr 21 15:35:27.047903 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.047878 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb47f\" (UniqueName: \"kubernetes.io/projected/de4d7367-97b7-475a-b70f-d1b1f47d5106-kube-api-access-qb47f\") pod \"node-ca-jddzn\" (UID: \"de4d7367-97b7-475a-b70f-d1b1f47d5106\") " pod="openshift-image-registry/node-ca-jddzn" Apr 21 15:35:27.048407 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.048384 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnsxg\" (UniqueName: \"kubernetes.io/projected/125effdd-7036-4d2f-ae23-d0516355b243-kube-api-access-xnsxg\") pod \"aws-ebs-csi-driver-node-vkgzg\" (UID: \"125effdd-7036-4d2f-ae23-d0516355b243\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkgzg" Apr 21 15:35:27.048810 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.048785 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdmr7\" (UniqueName: \"kubernetes.io/projected/50b93a7e-ace3-4153-b2d3-ea527a654b34-kube-api-access-zdmr7\") pod \"node-resolver-9b9sq\" (UID: \"50b93a7e-ace3-4153-b2d3-ea527a654b34\") " pod="openshift-dns/node-resolver-9b9sq" Apr 21 15:35:27.049164 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.049142 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4bvk\" (UniqueName: \"kubernetes.io/projected/893ee07d-ac5e-4593-93fd-80655b690072-kube-api-access-x4bvk\") pod \"network-metrics-daemon-lp64c\" (UID: \"893ee07d-ac5e-4593-93fd-80655b690072\") " pod="openshift-multus/network-metrics-daemon-lp64c" Apr 21 15:35:27.049530 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.049511 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb5qn\" (UniqueName: 
\"kubernetes.io/projected/c2d06a3a-0637-4a19-b2ba-af896d234845-kube-api-access-lb5qn\") pod \"multus-cx47v\" (UID: \"c2d06a3a-0637-4a19-b2ba-af896d234845\") " pod="openshift-multus/multus-cx47v" Apr 21 15:35:27.050163 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.050137 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4z5n\" (UniqueName: \"kubernetes.io/projected/7b0083d8-b152-40fa-9a89-e3180ed1747d-kube-api-access-q4z5n\") pod \"ovnkube-node-wkxqg\" (UID: \"7b0083d8-b152-40fa-9a89-e3180ed1747d\") " pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:27.134265 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.134237 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 15:35:27.214376 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.214338 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6l9p8" Apr 21 15:35:27.222282 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.222258 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-84tg6" Apr 21 15:35:27.237186 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.237154 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:27.244963 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.244931 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkgzg" Apr 21 15:35:27.250620 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.250594 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-m6tzn" Apr 21 15:35:27.257328 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.257309 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xwpr7" Apr 21 15:35:27.265029 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.265009 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-cx47v" Apr 21 15:35:27.271618 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.271601 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9b9sq" Apr 21 15:35:27.278186 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.278163 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jddzn" Apr 21 15:35:27.318607 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:27.318548 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod402ecfa4_798f_4e6f_9d15_5c6ef953439a.slice/crio-fb54d4a0c21e5fadf9f7b03ff94ec6707bad4639844577fe3bce690b5eec793c WatchSource:0}: Error finding container fb54d4a0c21e5fadf9f7b03ff94ec6707bad4639844577fe3bce690b5eec793c: Status 404 returned error can't find the container with id fb54d4a0c21e5fadf9f7b03ff94ec6707bad4639844577fe3bce690b5eec793c Apr 21 15:35:27.319361 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:27.319176 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50b93a7e_ace3_4153_b2d3_ea527a654b34.slice/crio-27afd1f54f52533d5a54d8567b08ba62a0b834f7dfb65b4ea3897fb0b48ff49e WatchSource:0}: Error finding container 27afd1f54f52533d5a54d8567b08ba62a0b834f7dfb65b4ea3897fb0b48ff49e: Status 404 returned error can't find the container with id 
27afd1f54f52533d5a54d8567b08ba62a0b834f7dfb65b4ea3897fb0b48ff49e Apr 21 15:35:27.322367 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:27.322337 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e9c36e6_7be3_4c71_a7e3_e34aa989258a.slice/crio-5d6cadb649db2f8cae51e39d19d49be78df14acef1bbfe8c752b626a0f173b28 WatchSource:0}: Error finding container 5d6cadb649db2f8cae51e39d19d49be78df14acef1bbfe8c752b626a0f173b28: Status 404 returned error can't find the container with id 5d6cadb649db2f8cae51e39d19d49be78df14acef1bbfe8c752b626a0f173b28 Apr 21 15:35:27.325520 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:27.325293 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b0083d8_b152_40fa_9a89_e3180ed1747d.slice/crio-50130997156c4a2a88b80be654a2aa6a9639b8590b6a2a6de4439be4c1aa9994 WatchSource:0}: Error finding container 50130997156c4a2a88b80be654a2aa6a9639b8590b6a2a6de4439be4c1aa9994: Status 404 returned error can't find the container with id 50130997156c4a2a88b80be654a2aa6a9639b8590b6a2a6de4439be4c1aa9994 Apr 21 15:35:27.327974 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:27.327953 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod125effdd_7036_4d2f_ae23_d0516355b243.slice/crio-ffacf635ed07743ffcc2c22c1b823126afb2cff7166148841a692ce68083ee2f WatchSource:0}: Error finding container ffacf635ed07743ffcc2c22c1b823126afb2cff7166148841a692ce68083ee2f: Status 404 returned error can't find the container with id ffacf635ed07743ffcc2c22c1b823126afb2cff7166148841a692ce68083ee2f Apr 21 15:35:27.328969 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:27.328880 2569 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod822fd92b_9cc2_44e7_972a_9b68cde8ab9a.slice/crio-ae3e398683057dd4f160eeb417ed32e6c302d437e37f960d7878bf51a26f6f30 WatchSource:0}: Error finding container ae3e398683057dd4f160eeb417ed32e6c302d437e37f960d7878bf51a26f6f30: Status 404 returned error can't find the container with id ae3e398683057dd4f160eeb417ed32e6c302d437e37f960d7878bf51a26f6f30 Apr 21 15:35:27.331032 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:35:27.330288 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62ea1e4f_5379_4cf2_88a7_8b037ce9dd1d.slice/crio-85e2af2904007d033ed86084ed45dd7a5a0cb03caf7ff28081dfc2e576299dd0 WatchSource:0}: Error finding container 85e2af2904007d033ed86084ed45dd7a5a0cb03caf7ff28081dfc2e576299dd0: Status 404 returned error can't find the container with id 85e2af2904007d033ed86084ed45dd7a5a0cb03caf7ff28081dfc2e576299dd0 Apr 21 15:35:27.332442 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.332282 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 15:35:27.534939 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.534691 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/893ee07d-ac5e-4593-93fd-80655b690072-metrics-certs\") pod \"network-metrics-daemon-lp64c\" (UID: \"893ee07d-ac5e-4593-93fd-80655b690072\") " pod="openshift-multus/network-metrics-daemon-lp64c" Apr 21 15:35:27.534939 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:27.534829 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:27.535104 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:27.534964 2569 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/893ee07d-ac5e-4593-93fd-80655b690072-metrics-certs podName:893ee07d-ac5e-4593-93fd-80655b690072 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:28.534948451 +0000 UTC m=+4.106981372 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/893ee07d-ac5e-4593-93fd-80655b690072-metrics-certs") pod "network-metrics-daemon-lp64c" (UID: "893ee07d-ac5e-4593-93fd-80655b690072") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:27.635458 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.635423 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2jzd\" (UniqueName: \"kubernetes.io/projected/bbf400a9-66da-48f5-ba51-6ecd75c50fa2-kube-api-access-v2jzd\") pod \"network-check-target-ct78s\" (UID: \"bbf400a9-66da-48f5-ba51-6ecd75c50fa2\") " pod="openshift-network-diagnostics/network-check-target-ct78s" Apr 21 15:35:27.635641 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:27.635609 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:35:27.635641 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:27.635632 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:35:27.635743 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:27.635647 2569 projected.go:194] Error preparing data for projected volume kube-api-access-v2jzd for pod openshift-network-diagnostics/network-check-target-ct78s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:27.635743 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:27.635705 2569 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bbf400a9-66da-48f5-ba51-6ecd75c50fa2-kube-api-access-v2jzd podName:bbf400a9-66da-48f5-ba51-6ecd75c50fa2 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:28.635685187 +0000 UTC m=+4.207718124 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-v2jzd" (UniqueName: "kubernetes.io/projected/bbf400a9-66da-48f5-ba51-6ecd75c50fa2-kube-api-access-v2jzd") pod "network-check-target-ct78s" (UID: "bbf400a9-66da-48f5-ba51-6ecd75c50fa2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:27.960618 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.960546 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 15:30:25 +0000 UTC" deadline="2028-01-14 19:16:34.289006554 +0000 UTC" Apr 21 15:35:27.960618 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:27.960584 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15195h41m6.328425544s" Apr 21 15:35:28.050206 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:28.049516 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lp64c" Apr 21 15:35:28.050206 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:28.049649 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lp64c" podUID="893ee07d-ac5e-4593-93fd-80655b690072" Apr 21 15:35:28.050206 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:28.050054 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ct78s" Apr 21 15:35:28.050206 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:28.050142 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ct78s" podUID="bbf400a9-66da-48f5-ba51-6ecd75c50fa2" Apr 21 15:35:28.084667 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:28.084623 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jddzn" event={"ID":"de4d7367-97b7-475a-b70f-d1b1f47d5106","Type":"ContainerStarted","Data":"3c0f5300ada75feada3035e4428e31d2be5559dd2e980207da8096884ee83d2c"} Apr 21 15:35:28.093940 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:28.093906 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6l9p8" event={"ID":"822fd92b-9cc2-44e7-972a-9b68cde8ab9a","Type":"ContainerStarted","Data":"ae3e398683057dd4f160eeb417ed32e6c302d437e37f960d7878bf51a26f6f30"} Apr 21 15:35:28.103738 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:28.103702 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" event={"ID":"7b0083d8-b152-40fa-9a89-e3180ed1747d","Type":"ContainerStarted","Data":"50130997156c4a2a88b80be654a2aa6a9639b8590b6a2a6de4439be4c1aa9994"} Apr 21 15:35:28.106728 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:28.106679 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkgzg" event={"ID":"125effdd-7036-4d2f-ae23-d0516355b243","Type":"ContainerStarted","Data":"ffacf635ed07743ffcc2c22c1b823126afb2cff7166148841a692ce68083ee2f"} Apr 21 15:35:28.118407 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:28.118313 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9b9sq" event={"ID":"50b93a7e-ace3-4153-b2d3-ea527a654b34","Type":"ContainerStarted","Data":"27afd1f54f52533d5a54d8567b08ba62a0b834f7dfb65b4ea3897fb0b48ff49e"} Apr 21 15:35:28.120860 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:28.120798 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xwpr7" event={"ID":"402ecfa4-798f-4e6f-9d15-5c6ef953439a","Type":"ContainerStarted","Data":"fb54d4a0c21e5fadf9f7b03ff94ec6707bad4639844577fe3bce690b5eec793c"} Apr 21 15:35:28.129392 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:28.128715 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-232.ec2.internal" event={"ID":"fca69f1fb3124857e02406d9db421c8d","Type":"ContainerStarted","Data":"8d8e41a934b4d5b1d2bf16f9ea1b1a8818ffd1d45b085c0ae31c920a36767e11"} Apr 21 15:35:28.141195 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:28.141160 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cx47v" event={"ID":"c2d06a3a-0637-4a19-b2ba-af896d234845","Type":"ContainerStarted","Data":"3a850245fdb1fe6f14a82ff5fa9a60b6237515f9e8fa8df389e0f35a73ca5d9b"} Apr 21 15:35:28.151359 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:28.151328 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-84tg6" event={"ID":"62ea1e4f-5379-4cf2-88a7-8b037ce9dd1d","Type":"ContainerStarted","Data":"85e2af2904007d033ed86084ed45dd7a5a0cb03caf7ff28081dfc2e576299dd0"} Apr 21 15:35:28.154643 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:28.154611 
2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-m6tzn" event={"ID":"3e9c36e6-7be3-4c71-a7e3-e34aa989258a","Type":"ContainerStarted","Data":"5d6cadb649db2f8cae51e39d19d49be78df14acef1bbfe8c752b626a0f173b28"} Apr 21 15:35:28.544573 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:28.544533 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/893ee07d-ac5e-4593-93fd-80655b690072-metrics-certs\") pod \"network-metrics-daemon-lp64c\" (UID: \"893ee07d-ac5e-4593-93fd-80655b690072\") " pod="openshift-multus/network-metrics-daemon-lp64c" Apr 21 15:35:28.544747 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:28.544672 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:28.544747 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:28.544739 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/893ee07d-ac5e-4593-93fd-80655b690072-metrics-certs podName:893ee07d-ac5e-4593-93fd-80655b690072 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:30.544719643 +0000 UTC m=+6.116752566 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/893ee07d-ac5e-4593-93fd-80655b690072-metrics-certs") pod "network-metrics-daemon-lp64c" (UID: "893ee07d-ac5e-4593-93fd-80655b690072") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:28.645826 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:28.645780 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2jzd\" (UniqueName: \"kubernetes.io/projected/bbf400a9-66da-48f5-ba51-6ecd75c50fa2-kube-api-access-v2jzd\") pod \"network-check-target-ct78s\" (UID: \"bbf400a9-66da-48f5-ba51-6ecd75c50fa2\") " pod="openshift-network-diagnostics/network-check-target-ct78s" Apr 21 15:35:28.646004 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:28.645991 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:35:28.646079 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:28.646012 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:35:28.646079 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:28.646025 2569 projected.go:194] Error preparing data for projected volume kube-api-access-v2jzd for pod openshift-network-diagnostics/network-check-target-ct78s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:28.646190 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:28.646083 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bbf400a9-66da-48f5-ba51-6ecd75c50fa2-kube-api-access-v2jzd podName:bbf400a9-66da-48f5-ba51-6ecd75c50fa2 nodeName:}" failed. 
No retries permitted until 2026-04-21 15:35:30.646065048 +0000 UTC m=+6.218097974 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-v2jzd" (UniqueName: "kubernetes.io/projected/bbf400a9-66da-48f5-ba51-6ecd75c50fa2-kube-api-access-v2jzd") pod "network-check-target-ct78s" (UID: "bbf400a9-66da-48f5-ba51-6ecd75c50fa2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:30.049980 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:30.049899 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lp64c" Apr 21 15:35:30.050459 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:30.050041 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lp64c" podUID="893ee07d-ac5e-4593-93fd-80655b690072" Apr 21 15:35:30.050560 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:30.050511 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ct78s" Apr 21 15:35:30.050633 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:30.050614 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ct78s" podUID="bbf400a9-66da-48f5-ba51-6ecd75c50fa2" Apr 21 15:35:30.564329 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:30.564287 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/893ee07d-ac5e-4593-93fd-80655b690072-metrics-certs\") pod \"network-metrics-daemon-lp64c\" (UID: \"893ee07d-ac5e-4593-93fd-80655b690072\") " pod="openshift-multus/network-metrics-daemon-lp64c" Apr 21 15:35:30.564586 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:30.564435 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:30.564586 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:30.564518 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/893ee07d-ac5e-4593-93fd-80655b690072-metrics-certs podName:893ee07d-ac5e-4593-93fd-80655b690072 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:34.564498003 +0000 UTC m=+10.136530930 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/893ee07d-ac5e-4593-93fd-80655b690072-metrics-certs") pod "network-metrics-daemon-lp64c" (UID: "893ee07d-ac5e-4593-93fd-80655b690072") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:30.665224 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:30.665188 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2jzd\" (UniqueName: \"kubernetes.io/projected/bbf400a9-66da-48f5-ba51-6ecd75c50fa2-kube-api-access-v2jzd\") pod \"network-check-target-ct78s\" (UID: \"bbf400a9-66da-48f5-ba51-6ecd75c50fa2\") " pod="openshift-network-diagnostics/network-check-target-ct78s" Apr 21 15:35:30.665403 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:30.665373 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:35:30.665403 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:30.665401 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:35:30.665631 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:30.665415 2569 projected.go:194] Error preparing data for projected volume kube-api-access-v2jzd for pod openshift-network-diagnostics/network-check-target-ct78s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:30.665631 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:30.665495 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bbf400a9-66da-48f5-ba51-6ecd75c50fa2-kube-api-access-v2jzd podName:bbf400a9-66da-48f5-ba51-6ecd75c50fa2 nodeName:}" failed. 
No retries permitted until 2026-04-21 15:35:34.665461275 +0000 UTC m=+10.237494195 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-v2jzd" (UniqueName: "kubernetes.io/projected/bbf400a9-66da-48f5-ba51-6ecd75c50fa2-kube-api-access-v2jzd") pod "network-check-target-ct78s" (UID: "bbf400a9-66da-48f5-ba51-6ecd75c50fa2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:32.050297 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:32.050234 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lp64c" Apr 21 15:35:32.050795 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:32.050234 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ct78s" Apr 21 15:35:32.050795 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:32.050384 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lp64c" podUID="893ee07d-ac5e-4593-93fd-80655b690072" Apr 21 15:35:32.050795 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:32.050461 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ct78s" podUID="bbf400a9-66da-48f5-ba51-6ecd75c50fa2" Apr 21 15:35:34.049811 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:34.049779 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lp64c" Apr 21 15:35:34.050280 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:34.049929 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lp64c" podUID="893ee07d-ac5e-4593-93fd-80655b690072" Apr 21 15:35:34.050280 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:34.049779 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ct78s" Apr 21 15:35:34.050390 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:34.050303 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ct78s" podUID="bbf400a9-66da-48f5-ba51-6ecd75c50fa2" Apr 21 15:35:34.598205 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:34.598147 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/893ee07d-ac5e-4593-93fd-80655b690072-metrics-certs\") pod \"network-metrics-daemon-lp64c\" (UID: \"893ee07d-ac5e-4593-93fd-80655b690072\") " pod="openshift-multus/network-metrics-daemon-lp64c" Apr 21 15:35:34.598393 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:34.598353 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:34.598458 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:34.598435 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/893ee07d-ac5e-4593-93fd-80655b690072-metrics-certs podName:893ee07d-ac5e-4593-93fd-80655b690072 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:42.598412339 +0000 UTC m=+18.170445263 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/893ee07d-ac5e-4593-93fd-80655b690072-metrics-certs") pod "network-metrics-daemon-lp64c" (UID: "893ee07d-ac5e-4593-93fd-80655b690072") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:34.698807 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:34.698763 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2jzd\" (UniqueName: \"kubernetes.io/projected/bbf400a9-66da-48f5-ba51-6ecd75c50fa2-kube-api-access-v2jzd\") pod \"network-check-target-ct78s\" (UID: \"bbf400a9-66da-48f5-ba51-6ecd75c50fa2\") " pod="openshift-network-diagnostics/network-check-target-ct78s" Apr 21 15:35:34.698967 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:34.698951 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:35:34.699041 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:34.698971 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:35:34.699041 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:34.698983 2569 projected.go:194] Error preparing data for projected volume kube-api-access-v2jzd for pod openshift-network-diagnostics/network-check-target-ct78s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:34.699133 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:34.699042 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bbf400a9-66da-48f5-ba51-6ecd75c50fa2-kube-api-access-v2jzd podName:bbf400a9-66da-48f5-ba51-6ecd75c50fa2 nodeName:}" failed. 
No retries permitted until 2026-04-21 15:35:42.69902414 +0000 UTC m=+18.271057062 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-v2jzd" (UniqueName: "kubernetes.io/projected/bbf400a9-66da-48f5-ba51-6ecd75c50fa2-kube-api-access-v2jzd") pod "network-check-target-ct78s" (UID: "bbf400a9-66da-48f5-ba51-6ecd75c50fa2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:36.050174 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:36.050132 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lp64c" Apr 21 15:35:36.050644 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:36.050132 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ct78s" Apr 21 15:35:36.050644 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:36.050281 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lp64c" podUID="893ee07d-ac5e-4593-93fd-80655b690072" Apr 21 15:35:36.050644 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:36.050352 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ct78s" podUID="bbf400a9-66da-48f5-ba51-6ecd75c50fa2" Apr 21 15:35:36.087667 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:36.087606 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-232.ec2.internal" podStartSLOduration=10.087588728 podStartE2EDuration="10.087588728s" podCreationTimestamp="2026-04-21 15:35:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:35:28.146633823 +0000 UTC m=+3.718666768" watchObservedRunningTime="2026-04-21 15:35:36.087588728 +0000 UTC m=+11.659621670" Apr 21 15:35:36.088212 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:36.088189 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-4mz5z"] Apr 21 15:35:36.091436 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:36.091413 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4mz5z" Apr 21 15:35:36.091540 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:36.091496 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-4mz5z" podUID="324dbbf6-3ab0-424a-a580-24aa71c13cb6" Apr 21 15:35:36.206864 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:36.206834 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/324dbbf6-3ab0-424a-a580-24aa71c13cb6-dbus\") pod \"global-pull-secret-syncer-4mz5z\" (UID: \"324dbbf6-3ab0-424a-a580-24aa71c13cb6\") " pod="kube-system/global-pull-secret-syncer-4mz5z" Apr 21 15:35:36.206864 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:36.206872 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/324dbbf6-3ab0-424a-a580-24aa71c13cb6-kubelet-config\") pod \"global-pull-secret-syncer-4mz5z\" (UID: \"324dbbf6-3ab0-424a-a580-24aa71c13cb6\") " pod="kube-system/global-pull-secret-syncer-4mz5z" Apr 21 15:35:36.207066 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:36.206900 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/324dbbf6-3ab0-424a-a580-24aa71c13cb6-original-pull-secret\") pod \"global-pull-secret-syncer-4mz5z\" (UID: \"324dbbf6-3ab0-424a-a580-24aa71c13cb6\") " pod="kube-system/global-pull-secret-syncer-4mz5z" Apr 21 15:35:36.307840 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:36.307751 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/324dbbf6-3ab0-424a-a580-24aa71c13cb6-dbus\") pod \"global-pull-secret-syncer-4mz5z\" (UID: \"324dbbf6-3ab0-424a-a580-24aa71c13cb6\") " pod="kube-system/global-pull-secret-syncer-4mz5z" Apr 21 15:35:36.307840 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:36.307794 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/324dbbf6-3ab0-424a-a580-24aa71c13cb6-kubelet-config\") pod \"global-pull-secret-syncer-4mz5z\" (UID: \"324dbbf6-3ab0-424a-a580-24aa71c13cb6\") " pod="kube-system/global-pull-secret-syncer-4mz5z" Apr 21 15:35:36.307840 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:36.307816 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/324dbbf6-3ab0-424a-a580-24aa71c13cb6-original-pull-secret\") pod \"global-pull-secret-syncer-4mz5z\" (UID: \"324dbbf6-3ab0-424a-a580-24aa71c13cb6\") " pod="kube-system/global-pull-secret-syncer-4mz5z" Apr 21 15:35:36.308101 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:36.307956 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:36.308101 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:36.307962 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/324dbbf6-3ab0-424a-a580-24aa71c13cb6-dbus\") pod \"global-pull-secret-syncer-4mz5z\" (UID: \"324dbbf6-3ab0-424a-a580-24aa71c13cb6\") " pod="kube-system/global-pull-secret-syncer-4mz5z" Apr 21 15:35:36.308101 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:36.308013 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/324dbbf6-3ab0-424a-a580-24aa71c13cb6-original-pull-secret podName:324dbbf6-3ab0-424a-a580-24aa71c13cb6 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:36.807995334 +0000 UTC m=+12.380028254 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/324dbbf6-3ab0-424a-a580-24aa71c13cb6-original-pull-secret") pod "global-pull-secret-syncer-4mz5z" (UID: "324dbbf6-3ab0-424a-a580-24aa71c13cb6") : object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:36.308101 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:36.308028 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/324dbbf6-3ab0-424a-a580-24aa71c13cb6-kubelet-config\") pod \"global-pull-secret-syncer-4mz5z\" (UID: \"324dbbf6-3ab0-424a-a580-24aa71c13cb6\") " pod="kube-system/global-pull-secret-syncer-4mz5z" Apr 21 15:35:36.809827 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:36.809789 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/324dbbf6-3ab0-424a-a580-24aa71c13cb6-original-pull-secret\") pod \"global-pull-secret-syncer-4mz5z\" (UID: \"324dbbf6-3ab0-424a-a580-24aa71c13cb6\") " pod="kube-system/global-pull-secret-syncer-4mz5z" Apr 21 15:35:36.810013 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:36.809968 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:36.810071 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:36.810057 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/324dbbf6-3ab0-424a-a580-24aa71c13cb6-original-pull-secret podName:324dbbf6-3ab0-424a-a580-24aa71c13cb6 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:37.810033277 +0000 UTC m=+13.382066210 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/324dbbf6-3ab0-424a-a580-24aa71c13cb6-original-pull-secret") pod "global-pull-secret-syncer-4mz5z" (UID: "324dbbf6-3ab0-424a-a580-24aa71c13cb6") : object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:37.819037 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:37.818994 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/324dbbf6-3ab0-424a-a580-24aa71c13cb6-original-pull-secret\") pod \"global-pull-secret-syncer-4mz5z\" (UID: \"324dbbf6-3ab0-424a-a580-24aa71c13cb6\") " pod="kube-system/global-pull-secret-syncer-4mz5z" Apr 21 15:35:37.819438 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:37.819116 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:37.819438 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:37.819176 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/324dbbf6-3ab0-424a-a580-24aa71c13cb6-original-pull-secret podName:324dbbf6-3ab0-424a-a580-24aa71c13cb6 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:39.819159753 +0000 UTC m=+15.391192674 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/324dbbf6-3ab0-424a-a580-24aa71c13cb6-original-pull-secret") pod "global-pull-secret-syncer-4mz5z" (UID: "324dbbf6-3ab0-424a-a580-24aa71c13cb6") : object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:38.050395 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:38.050361 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4mz5z" Apr 21 15:35:38.050587 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:38.050435 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lp64c" Apr 21 15:35:38.050653 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:38.050569 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4mz5z" podUID="324dbbf6-3ab0-424a-a580-24aa71c13cb6" Apr 21 15:35:38.050653 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:38.050630 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ct78s" Apr 21 15:35:38.050751 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:38.050716 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lp64c" podUID="893ee07d-ac5e-4593-93fd-80655b690072" Apr 21 15:35:38.050811 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:38.050792 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ct78s" podUID="bbf400a9-66da-48f5-ba51-6ecd75c50fa2" Apr 21 15:35:39.832994 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:39.832946 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/324dbbf6-3ab0-424a-a580-24aa71c13cb6-original-pull-secret\") pod \"global-pull-secret-syncer-4mz5z\" (UID: \"324dbbf6-3ab0-424a-a580-24aa71c13cb6\") " pod="kube-system/global-pull-secret-syncer-4mz5z" Apr 21 15:35:39.833396 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:39.833113 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:39.833396 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:39.833196 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/324dbbf6-3ab0-424a-a580-24aa71c13cb6-original-pull-secret podName:324dbbf6-3ab0-424a-a580-24aa71c13cb6 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:43.833174153 +0000 UTC m=+19.405207087 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/324dbbf6-3ab0-424a-a580-24aa71c13cb6-original-pull-secret") pod "global-pull-secret-syncer-4mz5z" (UID: "324dbbf6-3ab0-424a-a580-24aa71c13cb6") : object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:40.049645 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:40.049613 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ct78s" Apr 21 15:35:40.049794 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:40.049716 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ct78s" podUID="bbf400a9-66da-48f5-ba51-6ecd75c50fa2" Apr 21 15:35:40.049884 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:40.049861 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4mz5z" Apr 21 15:35:40.049997 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:40.049912 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lp64c" Apr 21 15:35:40.050035 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:40.049987 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lp64c" podUID="893ee07d-ac5e-4593-93fd-80655b690072" Apr 21 15:35:40.050067 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:40.050045 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-4mz5z" podUID="324dbbf6-3ab0-424a-a580-24aa71c13cb6" Apr 21 15:35:42.050275 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:42.050239 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ct78s" Apr 21 15:35:42.050755 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:42.050239 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4mz5z" Apr 21 15:35:42.050755 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:42.050379 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ct78s" podUID="bbf400a9-66da-48f5-ba51-6ecd75c50fa2" Apr 21 15:35:42.050755 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:42.050461 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4mz5z" podUID="324dbbf6-3ab0-424a-a580-24aa71c13cb6" Apr 21 15:35:42.050755 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:42.050239 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lp64c" Apr 21 15:35:42.050755 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:42.050644 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lp64c" podUID="893ee07d-ac5e-4593-93fd-80655b690072" Apr 21 15:35:42.654601 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:42.654561 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/893ee07d-ac5e-4593-93fd-80655b690072-metrics-certs\") pod \"network-metrics-daemon-lp64c\" (UID: \"893ee07d-ac5e-4593-93fd-80655b690072\") " pod="openshift-multus/network-metrics-daemon-lp64c" Apr 21 15:35:42.654776 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:42.654718 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:42.654855 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:42.654786 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/893ee07d-ac5e-4593-93fd-80655b690072-metrics-certs podName:893ee07d-ac5e-4593-93fd-80655b690072 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:58.654766666 +0000 UTC m=+34.226799588 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/893ee07d-ac5e-4593-93fd-80655b690072-metrics-certs") pod "network-metrics-daemon-lp64c" (UID: "893ee07d-ac5e-4593-93fd-80655b690072") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:42.755164 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:42.755129 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2jzd\" (UniqueName: \"kubernetes.io/projected/bbf400a9-66da-48f5-ba51-6ecd75c50fa2-kube-api-access-v2jzd\") pod \"network-check-target-ct78s\" (UID: \"bbf400a9-66da-48f5-ba51-6ecd75c50fa2\") " pod="openshift-network-diagnostics/network-check-target-ct78s" Apr 21 15:35:42.755348 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:42.755289 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:35:42.755348 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:42.755305 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:35:42.755348 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:42.755315 2569 projected.go:194] Error preparing data for projected volume kube-api-access-v2jzd for pod openshift-network-diagnostics/network-check-target-ct78s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:42.755473 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:42.755375 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bbf400a9-66da-48f5-ba51-6ecd75c50fa2-kube-api-access-v2jzd podName:bbf400a9-66da-48f5-ba51-6ecd75c50fa2 nodeName:}" failed. 
No retries permitted until 2026-04-21 15:35:58.755356368 +0000 UTC m=+34.327389294 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-v2jzd" (UniqueName: "kubernetes.io/projected/bbf400a9-66da-48f5-ba51-6ecd75c50fa2-kube-api-access-v2jzd") pod "network-check-target-ct78s" (UID: "bbf400a9-66da-48f5-ba51-6ecd75c50fa2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:43.862244 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:43.862201 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/324dbbf6-3ab0-424a-a580-24aa71c13cb6-original-pull-secret\") pod \"global-pull-secret-syncer-4mz5z\" (UID: \"324dbbf6-3ab0-424a-a580-24aa71c13cb6\") " pod="kube-system/global-pull-secret-syncer-4mz5z" Apr 21 15:35:43.862715 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:43.862331 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:43.862715 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:43.862387 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/324dbbf6-3ab0-424a-a580-24aa71c13cb6-original-pull-secret podName:324dbbf6-3ab0-424a-a580-24aa71c13cb6 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:51.862369712 +0000 UTC m=+27.434402632 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/324dbbf6-3ab0-424a-a580-24aa71c13cb6-original-pull-secret") pod "global-pull-secret-syncer-4mz5z" (UID: "324dbbf6-3ab0-424a-a580-24aa71c13cb6") : object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:44.050074 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:44.050035 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-4mz5z" Apr 21 15:35:44.050242 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:44.050036 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ct78s" Apr 21 15:35:44.050242 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:44.050172 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4mz5z" podUID="324dbbf6-3ab0-424a-a580-24aa71c13cb6" Apr 21 15:35:44.050342 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:44.050042 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lp64c" Apr 21 15:35:44.050342 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:44.050263 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ct78s" podUID="bbf400a9-66da-48f5-ba51-6ecd75c50fa2" Apr 21 15:35:44.050455 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:44.050353 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lp64c" podUID="893ee07d-ac5e-4593-93fd-80655b690072" Apr 21 15:35:46.049814 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:46.049462 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4mz5z" Apr 21 15:35:46.050474 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:46.049520 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ct78s" Apr 21 15:35:46.050474 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:46.049909 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4mz5z" podUID="324dbbf6-3ab0-424a-a580-24aa71c13cb6" Apr 21 15:35:46.050474 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:46.049544 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lp64c" Apr 21 15:35:46.050474 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:46.049958 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ct78s" podUID="bbf400a9-66da-48f5-ba51-6ecd75c50fa2" Apr 21 15:35:46.050474 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:46.050052 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lp64c" podUID="893ee07d-ac5e-4593-93fd-80655b690072" Apr 21 15:35:46.194768 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:46.194736 2569 generic.go:358] "Generic (PLEG): container finished" podID="0a0b3ad87f77adee4d1e995d037c7e36" containerID="963c99a3ec4488019cda31e49ca23b1c3b95655a8f3aa6e5417ac250f1df3875" exitCode=0 Apr 21 15:35:46.194961 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:46.194799 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-232.ec2.internal" event={"ID":"0a0b3ad87f77adee4d1e995d037c7e36","Type":"ContainerDied","Data":"963c99a3ec4488019cda31e49ca23b1c3b95655a8f3aa6e5417ac250f1df3875"} Apr 21 15:35:46.196156 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:46.196131 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cx47v" event={"ID":"c2d06a3a-0637-4a19-b2ba-af896d234845","Type":"ContainerStarted","Data":"e45a2461276faf71d68224a69e89bfabec9862f76640379179466813fecbd479"} Apr 21 15:35:46.197549 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:46.197467 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-m6tzn" event={"ID":"3e9c36e6-7be3-4c71-a7e3-e34aa989258a","Type":"ContainerStarted","Data":"e8a0cade534113d15b7ffa699f6b2990f11bc0c359ef143d3bd55356b05e59da"} Apr 21 15:35:46.198749 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:46.198699 2569 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jddzn" event={"ID":"de4d7367-97b7-475a-b70f-d1b1f47d5106","Type":"ContainerStarted","Data":"2984c0cf00d06b0765d2309ecbaaa1dca3f893c825bd6a4462c581d738eb26b9"} Apr 21 15:35:46.200003 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:46.199971 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6l9p8" event={"ID":"822fd92b-9cc2-44e7-972a-9b68cde8ab9a","Type":"ContainerStarted","Data":"454949cc61118d3e1bf614e7c2ac3beaf58ea04bea38a425cee7e142196fd692"} Apr 21 15:35:46.202058 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:46.202043 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wkxqg_7b0083d8-b152-40fa-9a89-e3180ed1747d/ovn-acl-logging/0.log" Apr 21 15:35:46.204621 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:46.204325 2569 generic.go:358] "Generic (PLEG): container finished" podID="7b0083d8-b152-40fa-9a89-e3180ed1747d" containerID="ac37da7ec048be3971ec29a86a8a82b767eb4ff5196a641e2ac17d9701156fb6" exitCode=1 Apr 21 15:35:46.204621 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:46.204420 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" event={"ID":"7b0083d8-b152-40fa-9a89-e3180ed1747d","Type":"ContainerStarted","Data":"76000c440ec0df855ed3b95130a74ac6c3d41186b14427f6cc4dac71c5ef4a4e"} Apr 21 15:35:46.204621 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:46.204453 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" event={"ID":"7b0083d8-b152-40fa-9a89-e3180ed1747d","Type":"ContainerStarted","Data":"26717caa9059dab0a4ea7d2b94b393005b61debed1dc7bdc190b5c475e5acc2e"} Apr 21 15:35:46.204621 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:46.204469 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" 
event={"ID":"7b0083d8-b152-40fa-9a89-e3180ed1747d","Type":"ContainerStarted","Data":"d183e56699f6590f6ba4aafe6c86ba74072a9f2845cefda416a650f928cca8c8"} Apr 21 15:35:46.204621 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:46.204541 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" event={"ID":"7b0083d8-b152-40fa-9a89-e3180ed1747d","Type":"ContainerStarted","Data":"9f7a982ef912f92d2b8d88b54d65ba6f034cba50aa5f7f1e866ce639aaa7c132"} Apr 21 15:35:46.204621 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:46.204557 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" event={"ID":"7b0083d8-b152-40fa-9a89-e3180ed1747d","Type":"ContainerDied","Data":"ac37da7ec048be3971ec29a86a8a82b767eb4ff5196a641e2ac17d9701156fb6"} Apr 21 15:35:46.204621 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:46.204581 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" event={"ID":"7b0083d8-b152-40fa-9a89-e3180ed1747d","Type":"ContainerStarted","Data":"c49c9bf1360874aa5d6a2f0605cac651117a5fb4dcfbd4943884e2fcd0e40bc2"} Apr 21 15:35:46.208653 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:46.208632 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkgzg" event={"ID":"125effdd-7036-4d2f-ae23-d0516355b243","Type":"ContainerStarted","Data":"f5343d76e877c99147b9786e6265c21b2f8856613ef2673571ca7c5df3592613"} Apr 21 15:35:46.209918 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:46.209891 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9b9sq" event={"ID":"50b93a7e-ace3-4153-b2d3-ea527a654b34","Type":"ContainerStarted","Data":"1dd426e78dcfbd620717da675ee9bceece76ee8d77361fad5822644ea8ba273b"} Apr 21 15:35:46.211436 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:46.211412 2569 generic.go:358] "Generic (PLEG): container finished" 
podID="402ecfa4-798f-4e6f-9d15-5c6ef953439a" containerID="4495aa06c3166f486f15d6567d56fd45b3c68d50d2a59ac0ab137ced87b40435" exitCode=0 Apr 21 15:35:46.211517 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:46.211454 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xwpr7" event={"ID":"402ecfa4-798f-4e6f-9d15-5c6ef953439a","Type":"ContainerDied","Data":"4495aa06c3166f486f15d6567d56fd45b3c68d50d2a59ac0ab137ced87b40435"} Apr 21 15:35:46.244378 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:46.244319 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-m6tzn" podStartSLOduration=3.644180688 podStartE2EDuration="21.244300826s" podCreationTimestamp="2026-04-21 15:35:25 +0000 UTC" firstStartedPulling="2026-04-21 15:35:27.324920855 +0000 UTC m=+2.896953783" lastFinishedPulling="2026-04-21 15:35:44.925040985 +0000 UTC m=+20.497073921" observedRunningTime="2026-04-21 15:35:46.228623361 +0000 UTC m=+21.800656317" watchObservedRunningTime="2026-04-21 15:35:46.244300826 +0000 UTC m=+21.816333770" Apr 21 15:35:46.264220 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:46.264177 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9b9sq" podStartSLOduration=3.6137814820000003 podStartE2EDuration="21.264162237s" podCreationTimestamp="2026-04-21 15:35:25 +0000 UTC" firstStartedPulling="2026-04-21 15:35:27.324993061 +0000 UTC m=+2.897025997" lastFinishedPulling="2026-04-21 15:35:44.975373828 +0000 UTC m=+20.547406752" observedRunningTime="2026-04-21 15:35:46.243883516 +0000 UTC m=+21.815916461" watchObservedRunningTime="2026-04-21 15:35:46.264162237 +0000 UTC m=+21.836195180" Apr 21 15:35:46.264355 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:46.264271 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jddzn" 
podStartSLOduration=3.6456351270000003 podStartE2EDuration="21.264264964s" podCreationTimestamp="2026-04-21 15:35:25 +0000 UTC" firstStartedPulling="2026-04-21 15:35:27.333668896 +0000 UTC m=+2.905701831" lastFinishedPulling="2026-04-21 15:35:44.952298747 +0000 UTC m=+20.524331668" observedRunningTime="2026-04-21 15:35:46.264225931 +0000 UTC m=+21.836258877" watchObservedRunningTime="2026-04-21 15:35:46.264264964 +0000 UTC m=+21.836297908" Apr 21 15:35:46.281959 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:46.281920 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-cx47v" podStartSLOduration=3.327159173 podStartE2EDuration="21.281907861s" podCreationTimestamp="2026-04-21 15:35:25 +0000 UTC" firstStartedPulling="2026-04-21 15:35:27.334803396 +0000 UTC m=+2.906836322" lastFinishedPulling="2026-04-21 15:35:45.289552089 +0000 UTC m=+20.861585010" observedRunningTime="2026-04-21 15:35:46.281337735 +0000 UTC m=+21.853370679" watchObservedRunningTime="2026-04-21 15:35:46.281907861 +0000 UTC m=+21.853940827" Apr 21 15:35:46.297405 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:46.297369 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-6l9p8" podStartSLOduration=3.655769007 podStartE2EDuration="21.297355821s" podCreationTimestamp="2026-04-21 15:35:25 +0000 UTC" firstStartedPulling="2026-04-21 15:35:27.330828924 +0000 UTC m=+2.902861859" lastFinishedPulling="2026-04-21 15:35:44.972415745 +0000 UTC m=+20.544448673" observedRunningTime="2026-04-21 15:35:46.296943985 +0000 UTC m=+21.868976930" watchObservedRunningTime="2026-04-21 15:35:46.297355821 +0000 UTC m=+21.869388761" Apr 21 15:35:46.323538 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:46.323509 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-6l9p8" Apr 21 15:35:46.324168 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:46.324145 2569 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-6l9p8" Apr 21 15:35:46.819528 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:46.819503 2569 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 15:35:46.983669 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:46.983498 2569 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T15:35:46.819524058Z","UUID":"0dc51d07-993b-497b-a0d3-d015924cafcd","Handler":null,"Name":"","Endpoint":""} Apr 21 15:35:46.985676 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:46.985648 2569 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 15:35:46.985823 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:46.985686 2569 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 15:35:47.215194 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:47.215160 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-232.ec2.internal" event={"ID":"0a0b3ad87f77adee4d1e995d037c7e36","Type":"ContainerStarted","Data":"3ae3c08dd291d38e278eb23427ddbf20d887e92d62f784f82f8239b903db5087"} Apr 21 15:35:47.216637 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:47.216593 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-84tg6" event={"ID":"62ea1e4f-5379-4cf2-88a7-8b037ce9dd1d","Type":"ContainerStarted","Data":"b9c34e39dfae209469ee978196bd0c30d3de15265a24ad7c272de9e54101f3b8"} Apr 21 15:35:47.218329 ip-10-0-128-232 kubenswrapper[2569]: 
I0421 15:35:47.218285 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkgzg" event={"ID":"125effdd-7036-4d2f-ae23-d0516355b243","Type":"ContainerStarted","Data":"4c8c32465f86dd7bcbf83bf94afc5685f92ef9aafb01f95cb17eb46041f7a686"} Apr 21 15:35:47.219172 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:47.219151 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-6l9p8" Apr 21 15:35:47.219919 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:47.219893 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-6l9p8" Apr 21 15:35:47.235984 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:47.235888 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-232.ec2.internal" podStartSLOduration=21.235868634 podStartE2EDuration="21.235868634s" podCreationTimestamp="2026-04-21 15:35:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:35:47.234907015 +0000 UTC m=+22.806939984" watchObservedRunningTime="2026-04-21 15:35:47.235868634 +0000 UTC m=+22.807901583" Apr 21 15:35:47.262352 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:47.262298 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-84tg6" podStartSLOduration=4.642664394 podStartE2EDuration="22.262285449s" podCreationTimestamp="2026-04-21 15:35:25 +0000 UTC" firstStartedPulling="2026-04-21 15:35:27.332702731 +0000 UTC m=+2.904735667" lastFinishedPulling="2026-04-21 15:35:44.952323785 +0000 UTC m=+20.524356722" observedRunningTime="2026-04-21 15:35:47.262163026 +0000 UTC m=+22.834195970" watchObservedRunningTime="2026-04-21 15:35:47.262285449 +0000 UTC m=+22.834318391" Apr 21 15:35:48.049960 
ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:48.049866 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4mz5z" Apr 21 15:35:48.049960 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:48.049940 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lp64c" Apr 21 15:35:48.050191 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:48.049976 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ct78s" Apr 21 15:35:48.050191 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:48.050085 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lp64c" podUID="893ee07d-ac5e-4593-93fd-80655b690072" Apr 21 15:35:48.050283 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:48.050261 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4mz5z" podUID="324dbbf6-3ab0-424a-a580-24aa71c13cb6" Apr 21 15:35:48.050364 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:48.050344 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ct78s" podUID="bbf400a9-66da-48f5-ba51-6ecd75c50fa2" Apr 21 15:35:48.222817 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:48.222789 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wkxqg_7b0083d8-b152-40fa-9a89-e3180ed1747d/ovn-acl-logging/0.log" Apr 21 15:35:48.223220 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:48.223144 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" event={"ID":"7b0083d8-b152-40fa-9a89-e3180ed1747d","Type":"ContainerStarted","Data":"52771afc025646de870d501fccd1777bfb4659fadef2c5881a2d7ed9d4ad5aa1"} Apr 21 15:35:48.225106 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:48.225076 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkgzg" event={"ID":"125effdd-7036-4d2f-ae23-d0516355b243","Type":"ContainerStarted","Data":"c8011b39f32d284c013296cc0df02e7896c5e2fa78666509aa8cd43bd6c17ba6"} Apr 21 15:35:50.049597 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:50.049513 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lp64c" Apr 21 15:35:50.050141 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:50.049513 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4mz5z" Apr 21 15:35:50.050141 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:50.049631 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lp64c" podUID="893ee07d-ac5e-4593-93fd-80655b690072" Apr 21 15:35:50.050141 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:50.049513 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ct78s" Apr 21 15:35:50.050141 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:50.049710 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4mz5z" podUID="324dbbf6-3ab0-424a-a580-24aa71c13cb6" Apr 21 15:35:50.050141 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:50.049801 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ct78s" podUID="bbf400a9-66da-48f5-ba51-6ecd75c50fa2" Apr 21 15:35:51.232578 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:51.232540 2569 generic.go:358] "Generic (PLEG): container finished" podID="402ecfa4-798f-4e6f-9d15-5c6ef953439a" containerID="eb9b09150846c33ff7e433f6188f829d5f4677b1b3c78b55e85bb0aa778b9134" exitCode=0 Apr 21 15:35:51.233506 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:51.232626 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xwpr7" event={"ID":"402ecfa4-798f-4e6f-9d15-5c6ef953439a","Type":"ContainerDied","Data":"eb9b09150846c33ff7e433f6188f829d5f4677b1b3c78b55e85bb0aa778b9134"} Apr 21 15:35:51.235607 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:51.235548 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wkxqg_7b0083d8-b152-40fa-9a89-e3180ed1747d/ovn-acl-logging/0.log" Apr 21 15:35:51.235877 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:51.235853 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" event={"ID":"7b0083d8-b152-40fa-9a89-e3180ed1747d","Type":"ContainerStarted","Data":"0468e7ca600c60f5d7138f347c5bf96fd1cb33e580bcd3a0d223451b639af2fd"} Apr 21 15:35:51.236251 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:51.236233 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:51.236342 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:51.236259 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:51.236442 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:51.236418 2569 scope.go:117] "RemoveContainer" containerID="ac37da7ec048be3971ec29a86a8a82b767eb4ff5196a641e2ac17d9701156fb6" Apr 21 15:35:51.251704 ip-10-0-128-232 kubenswrapper[2569]: 
I0421 15:35:51.251685 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:51.264053 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:51.264001 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkgzg" podStartSLOduration=5.880429561 podStartE2EDuration="26.263985093s" podCreationTimestamp="2026-04-21 15:35:25 +0000 UTC" firstStartedPulling="2026-04-21 15:35:27.329721903 +0000 UTC m=+2.901754827" lastFinishedPulling="2026-04-21 15:35:47.713277435 +0000 UTC m=+23.285310359" observedRunningTime="2026-04-21 15:35:48.252642318 +0000 UTC m=+23.824675275" watchObservedRunningTime="2026-04-21 15:35:51.263985093 +0000 UTC m=+26.836018030" Apr 21 15:35:51.917881 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:51.917621 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/324dbbf6-3ab0-424a-a580-24aa71c13cb6-original-pull-secret\") pod \"global-pull-secret-syncer-4mz5z\" (UID: \"324dbbf6-3ab0-424a-a580-24aa71c13cb6\") " pod="kube-system/global-pull-secret-syncer-4mz5z" Apr 21 15:35:51.918035 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:51.917782 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:51.918035 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:51.917988 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/324dbbf6-3ab0-424a-a580-24aa71c13cb6-original-pull-secret podName:324dbbf6-3ab0-424a-a580-24aa71c13cb6 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:07.917965688 +0000 UTC m=+43.489998608 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/324dbbf6-3ab0-424a-a580-24aa71c13cb6-original-pull-secret") pod "global-pull-secret-syncer-4mz5z" (UID: "324dbbf6-3ab0-424a-a580-24aa71c13cb6") : object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:52.049998 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:52.049959 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4mz5z" Apr 21 15:35:52.050157 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:52.050100 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ct78s" Apr 21 15:35:52.050232 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:52.050187 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ct78s" podUID="bbf400a9-66da-48f5-ba51-6ecd75c50fa2" Apr 21 15:35:52.050232 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:52.050094 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lp64c" Apr 21 15:35:52.050232 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:52.050079 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-4mz5z" podUID="324dbbf6-3ab0-424a-a580-24aa71c13cb6" Apr 21 15:35:52.050323 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:52.050282 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lp64c" podUID="893ee07d-ac5e-4593-93fd-80655b690072" Apr 21 15:35:52.191500 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:52.191461 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-4mz5z"] Apr 21 15:35:52.195577 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:52.195545 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ct78s"] Apr 21 15:35:52.196067 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:52.196042 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lp64c"] Apr 21 15:35:52.241000 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:52.240976 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wkxqg_7b0083d8-b152-40fa-9a89-e3180ed1747d/ovn-acl-logging/0.log" Apr 21 15:35:52.241385 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:52.241348 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" event={"ID":"7b0083d8-b152-40fa-9a89-e3180ed1747d","Type":"ContainerStarted","Data":"db76a0d286da62b4dc3b5b108c24cf5dc2c1f75f8037e1d788b2d3f77bf03309"} Apr 21 15:35:52.241621 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:52.241588 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:35:52.246160 ip-10-0-128-232 
kubenswrapper[2569]: I0421 15:35:52.246133 2569 generic.go:358] "Generic (PLEG): container finished" podID="402ecfa4-798f-4e6f-9d15-5c6ef953439a" containerID="e924a89b8a3595bc4b7c33ab56ecdbd8c69d9fb1304135efc26bc5cd38307c3e" exitCode=0
Apr 21 15:35:52.246285 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:52.246229 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lp64c"
Apr 21 15:35:52.246285 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:52.246231 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xwpr7" event={"ID":"402ecfa4-798f-4e6f-9d15-5c6ef953439a","Type":"ContainerDied","Data":"e924a89b8a3595bc4b7c33ab56ecdbd8c69d9fb1304135efc26bc5cd38307c3e"}
Apr 21 15:35:52.246408 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:52.246396 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4mz5z"
Apr 21 15:35:52.246463 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:52.246398 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lp64c" podUID="893ee07d-ac5e-4593-93fd-80655b690072"
Apr 21 15:35:52.246537 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:52.246495 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-4mz5z" podUID="324dbbf6-3ab0-424a-a580-24aa71c13cb6"
Apr 21 15:35:52.246537 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:52.246508 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ct78s"
Apr 21 15:35:52.246647 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:52.246608 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ct78s" podUID="bbf400a9-66da-48f5-ba51-6ecd75c50fa2"
Apr 21 15:35:52.257992 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:52.257968 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg"
Apr 21 15:35:52.279893 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:52.279835 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" podStartSLOduration=9.332487405 podStartE2EDuration="27.279821962s" podCreationTimestamp="2026-04-21 15:35:25 +0000 UTC" firstStartedPulling="2026-04-21 15:35:27.328790865 +0000 UTC m=+2.900823800" lastFinishedPulling="2026-04-21 15:35:45.276125421 +0000 UTC m=+20.848158357" observedRunningTime="2026-04-21 15:35:52.278213637 +0000 UTC m=+27.850246579" watchObservedRunningTime="2026-04-21 15:35:52.279821962 +0000 UTC m=+27.851854904"
Apr 21 15:35:53.249940 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:53.249910 2569 generic.go:358] "Generic (PLEG): container finished" podID="402ecfa4-798f-4e6f-9d15-5c6ef953439a" containerID="99c8adfbb2707d9cf8461ed49a8edf98be3e1f3d59dcece695c097afad3b22a7" exitCode=0
Apr 21 15:35:53.250372 ip-10-0-128-232 kubenswrapper[2569]: I0421
15:35:53.249988 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xwpr7" event={"ID":"402ecfa4-798f-4e6f-9d15-5c6ef953439a","Type":"ContainerDied","Data":"99c8adfbb2707d9cf8461ed49a8edf98be3e1f3d59dcece695c097afad3b22a7"}
Apr 21 15:35:54.049997 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:54.049968 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4mz5z"
Apr 21 15:35:54.049997 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:54.049992 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ct78s"
Apr 21 15:35:54.050199 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:54.050006 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lp64c"
Apr 21 15:35:54.050199 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:54.050121 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4mz5z" podUID="324dbbf6-3ab0-424a-a580-24aa71c13cb6"
Apr 21 15:35:54.050306 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:54.050207 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-ct78s" podUID="bbf400a9-66da-48f5-ba51-6ecd75c50fa2"
Apr 21 15:35:54.050355 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:54.050334 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lp64c" podUID="893ee07d-ac5e-4593-93fd-80655b690072"
Apr 21 15:35:56.049606 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:56.049574 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4mz5z"
Apr 21 15:35:56.050026 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:56.049602 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ct78s"
Apr 21 15:35:56.050026 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:56.049679 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4mz5z" podUID="324dbbf6-3ab0-424a-a580-24aa71c13cb6"
Apr 21 15:35:56.050026 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:56.049775 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-ct78s" podUID="bbf400a9-66da-48f5-ba51-6ecd75c50fa2"
Apr 21 15:35:56.050026 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:56.049574 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lp64c"
Apr 21 15:35:56.050026 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:56.049892 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lp64c" podUID="893ee07d-ac5e-4593-93fd-80655b690072"
Apr 21 15:35:58.050552 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.050315 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lp64c"
Apr 21 15:35:58.050980 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.050316 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ct78s"
Apr 21 15:35:58.050980 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:58.050676 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lp64c" podUID="893ee07d-ac5e-4593-93fd-80655b690072"
Apr 21 15:35:58.050980 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.050343 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-4mz5z"
Apr 21 15:35:58.050980 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:58.050745 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ct78s" podUID="bbf400a9-66da-48f5-ba51-6ecd75c50fa2"
Apr 21 15:35:58.050980 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:58.050829 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4mz5z" podUID="324dbbf6-3ab0-424a-a580-24aa71c13cb6"
Apr 21 15:35:58.285677 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.285505 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-232.ec2.internal" event="NodeReady"
Apr 21 15:35:58.285937 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.285683 2569 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 21 15:35:58.339612 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.339580 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-gdwg8"]
Apr 21 15:35:58.384647 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.384607 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-jpjds"]
Apr 21 15:35:58.384814 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.384786 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-dns/dns-default-gdwg8"
Apr 21 15:35:58.388740 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.388718 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 21 15:35:58.388740 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.388718 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7ld2z\""
Apr 21 15:35:58.388929 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.388754 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 21 15:35:58.398108 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.398079 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gdwg8"]
Apr 21 15:35:58.398235 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.398112 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jpjds"]
Apr 21 15:35:58.398299 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.398236 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jpjds"
Apr 21 15:35:58.402454 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.402430 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 21 15:35:58.402619 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.402464 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xkwb5\""
Apr 21 15:35:58.402806 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.402790 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 21 15:35:58.402882 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.402789 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 21 15:35:58.466551 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.466515 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cb43fce4-df4e-4cca-a455-90d323512faf-tmp-dir\") pod \"dns-default-gdwg8\" (UID: \"cb43fce4-df4e-4cca-a455-90d323512faf\") " pod="openshift-dns/dns-default-gdwg8"
Apr 21 15:35:58.466737 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.466585 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb43fce4-df4e-4cca-a455-90d323512faf-config-volume\") pod \"dns-default-gdwg8\" (UID: \"cb43fce4-df4e-4cca-a455-90d323512faf\") " pod="openshift-dns/dns-default-gdwg8"
Apr 21 15:35:58.466737 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.466652 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ksfb\" (UniqueName:
\"kubernetes.io/projected/cb43fce4-df4e-4cca-a455-90d323512faf-kube-api-access-7ksfb\") pod \"dns-default-gdwg8\" (UID: \"cb43fce4-df4e-4cca-a455-90d323512faf\") " pod="openshift-dns/dns-default-gdwg8"
Apr 21 15:35:58.466737 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.466694 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vvq5\" (UniqueName: \"kubernetes.io/projected/fffa5175-92d5-48ec-a153-baf3f061b044-kube-api-access-7vvq5\") pod \"ingress-canary-jpjds\" (UID: \"fffa5175-92d5-48ec-a153-baf3f061b044\") " pod="openshift-ingress-canary/ingress-canary-jpjds"
Apr 21 15:35:58.466737 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.466717 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb43fce4-df4e-4cca-a455-90d323512faf-metrics-tls\") pod \"dns-default-gdwg8\" (UID: \"cb43fce4-df4e-4cca-a455-90d323512faf\") " pod="openshift-dns/dns-default-gdwg8"
Apr 21 15:35:58.466921 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.466742 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fffa5175-92d5-48ec-a153-baf3f061b044-cert\") pod \"ingress-canary-jpjds\" (UID: \"fffa5175-92d5-48ec-a153-baf3f061b044\") " pod="openshift-ingress-canary/ingress-canary-jpjds"
Apr 21 15:35:58.568051 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.567966 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fffa5175-92d5-48ec-a153-baf3f061b044-cert\") pod \"ingress-canary-jpjds\" (UID: \"fffa5175-92d5-48ec-a153-baf3f061b044\") " pod="openshift-ingress-canary/ingress-canary-jpjds"
Apr 21 15:35:58.568051 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.568026 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume
\"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cb43fce4-df4e-4cca-a455-90d323512faf-tmp-dir\") pod \"dns-default-gdwg8\" (UID: \"cb43fce4-df4e-4cca-a455-90d323512faf\") " pod="openshift-dns/dns-default-gdwg8"
Apr 21 15:35:58.568277 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.568075 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb43fce4-df4e-4cca-a455-90d323512faf-config-volume\") pod \"dns-default-gdwg8\" (UID: \"cb43fce4-df4e-4cca-a455-90d323512faf\") " pod="openshift-dns/dns-default-gdwg8"
Apr 21 15:35:58.568277 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.568124 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ksfb\" (UniqueName: \"kubernetes.io/projected/cb43fce4-df4e-4cca-a455-90d323512faf-kube-api-access-7ksfb\") pod \"dns-default-gdwg8\" (UID: \"cb43fce4-df4e-4cca-a455-90d323512faf\") " pod="openshift-dns/dns-default-gdwg8"
Apr 21 15:35:58.568277 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:58.568138 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 15:35:58.568277 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.568158 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vvq5\" (UniqueName: \"kubernetes.io/projected/fffa5175-92d5-48ec-a153-baf3f061b044-kube-api-access-7vvq5\") pod \"ingress-canary-jpjds\" (UID: \"fffa5175-92d5-48ec-a153-baf3f061b044\") " pod="openshift-ingress-canary/ingress-canary-jpjds"
Apr 21 15:35:58.568277 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.568184 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb43fce4-df4e-4cca-a455-90d323512faf-metrics-tls\") pod \"dns-default-gdwg8\" (UID: \"cb43fce4-df4e-4cca-a455-90d323512faf\") "
pod="openshift-dns/dns-default-gdwg8"
Apr 21 15:35:58.568277 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:58.568210 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fffa5175-92d5-48ec-a153-baf3f061b044-cert podName:fffa5175-92d5-48ec-a153-baf3f061b044 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:59.068187858 +0000 UTC m=+34.640220793 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fffa5175-92d5-48ec-a153-baf3f061b044-cert") pod "ingress-canary-jpjds" (UID: "fffa5175-92d5-48ec-a153-baf3f061b044") : secret "canary-serving-cert" not found
Apr 21 15:35:58.568277 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:58.568267 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 15:35:58.568700 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:58.568316 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb43fce4-df4e-4cca-a455-90d323512faf-metrics-tls podName:cb43fce4-df4e-4cca-a455-90d323512faf nodeName:}" failed. No retries permitted until 2026-04-21 15:35:59.068300081 +0000 UTC m=+34.640333002 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cb43fce4-df4e-4cca-a455-90d323512faf-metrics-tls") pod "dns-default-gdwg8" (UID: "cb43fce4-df4e-4cca-a455-90d323512faf") : secret "dns-default-metrics-tls" not found
Apr 21 15:35:58.568700 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.568473 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cb43fce4-df4e-4cca-a455-90d323512faf-tmp-dir\") pod \"dns-default-gdwg8\" (UID: \"cb43fce4-df4e-4cca-a455-90d323512faf\") " pod="openshift-dns/dns-default-gdwg8"
Apr 21 15:35:58.568794 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.568745 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb43fce4-df4e-4cca-a455-90d323512faf-config-volume\") pod \"dns-default-gdwg8\" (UID: \"cb43fce4-df4e-4cca-a455-90d323512faf\") " pod="openshift-dns/dns-default-gdwg8"
Apr 21 15:35:58.582044 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.582013 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ksfb\" (UniqueName: \"kubernetes.io/projected/cb43fce4-df4e-4cca-a455-90d323512faf-kube-api-access-7ksfb\") pod \"dns-default-gdwg8\" (UID: \"cb43fce4-df4e-4cca-a455-90d323512faf\") " pod="openshift-dns/dns-default-gdwg8"
Apr 21 15:35:58.582206 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.582160 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vvq5\" (UniqueName: \"kubernetes.io/projected/fffa5175-92d5-48ec-a153-baf3f061b044-kube-api-access-7vvq5\") pod \"ingress-canary-jpjds\" (UID: \"fffa5175-92d5-48ec-a153-baf3f061b044\") " pod="openshift-ingress-canary/ingress-canary-jpjds"
Apr 21 15:35:58.669096 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.669056 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\"
(UniqueName: \"kubernetes.io/secret/893ee07d-ac5e-4593-93fd-80655b690072-metrics-certs\") pod \"network-metrics-daemon-lp64c\" (UID: \"893ee07d-ac5e-4593-93fd-80655b690072\") " pod="openshift-multus/network-metrics-daemon-lp64c"
Apr 21 15:35:58.669278 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:58.669223 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:35:58.669330 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:58.669300 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/893ee07d-ac5e-4593-93fd-80655b690072-metrics-certs podName:893ee07d-ac5e-4593-93fd-80655b690072 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:30.669280933 +0000 UTC m=+66.241313853 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/893ee07d-ac5e-4593-93fd-80655b690072-metrics-certs") pod "network-metrics-daemon-lp64c" (UID: "893ee07d-ac5e-4593-93fd-80655b690072") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:35:58.769850 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:58.769814 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2jzd\" (UniqueName: \"kubernetes.io/projected/bbf400a9-66da-48f5-ba51-6ecd75c50fa2-kube-api-access-v2jzd\") pod \"network-check-target-ct78s\" (UID: \"bbf400a9-66da-48f5-ba51-6ecd75c50fa2\") " pod="openshift-network-diagnostics/network-check-target-ct78s"
Apr 21 15:35:58.770018 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:58.769975 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 15:35:58.770018 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:58.769997 2569 projected.go:289] Couldn't get configMap
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 15:35:58.770018 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:58.770006 2569 projected.go:194] Error preparing data for projected volume kube-api-access-v2jzd for pod openshift-network-diagnostics/network-check-target-ct78s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:35:58.770126 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:58.770054 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bbf400a9-66da-48f5-ba51-6ecd75c50fa2-kube-api-access-v2jzd podName:bbf400a9-66da-48f5-ba51-6ecd75c50fa2 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:30.77004111 +0000 UTC m=+66.342074030 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-v2jzd" (UniqueName: "kubernetes.io/projected/bbf400a9-66da-48f5-ba51-6ecd75c50fa2-kube-api-access-v2jzd") pod "network-check-target-ct78s" (UID: "bbf400a9-66da-48f5-ba51-6ecd75c50fa2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:35:59.071089 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:59.071049 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb43fce4-df4e-4cca-a455-90d323512faf-metrics-tls\") pod \"dns-default-gdwg8\" (UID: \"cb43fce4-df4e-4cca-a455-90d323512faf\") " pod="openshift-dns/dns-default-gdwg8"
Apr 21 15:35:59.071089 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:35:59.071093 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fffa5175-92d5-48ec-a153-baf3f061b044-cert\") pod \"ingress-canary-jpjds\"
(UID: \"fffa5175-92d5-48ec-a153-baf3f061b044\") " pod="openshift-ingress-canary/ingress-canary-jpjds"
Apr 21 15:35:59.071648 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:59.071200 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 15:35:59.071648 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:59.071210 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 15:35:59.071648 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:59.071251 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fffa5175-92d5-48ec-a153-baf3f061b044-cert podName:fffa5175-92d5-48ec-a153-baf3f061b044 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:00.071236494 +0000 UTC m=+35.643269416 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fffa5175-92d5-48ec-a153-baf3f061b044-cert") pod "ingress-canary-jpjds" (UID: "fffa5175-92d5-48ec-a153-baf3f061b044") : secret "canary-serving-cert" not found
Apr 21 15:35:59.071648 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:35:59.071278 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb43fce4-df4e-4cca-a455-90d323512faf-metrics-tls podName:cb43fce4-df4e-4cca-a455-90d323512faf nodeName:}" failed. No retries permitted until 2026-04-21 15:36:00.071258481 +0000 UTC m=+35.643291405 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cb43fce4-df4e-4cca-a455-90d323512faf-metrics-tls") pod "dns-default-gdwg8" (UID: "cb43fce4-df4e-4cca-a455-90d323512faf") : secret "dns-default-metrics-tls" not found
Apr 21 15:36:00.050119 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:00.050073 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-4mz5z"
Apr 21 15:36:00.050267 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:00.050073 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ct78s"
Apr 21 15:36:00.050382 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:00.050082 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lp64c"
Apr 21 15:36:00.055260 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:00.055003 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-ndcm7\""
Apr 21 15:36:00.055260 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:00.055024 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 15:36:00.055260 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:00.055073 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 15:36:00.055260 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:00.055088 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 21 15:36:00.055260 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:00.055120 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-sjbkh\""
Apr 21 15:36:00.055260 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:00.055024 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 15:36:00.077473 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:00.077451 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName:
\"kubernetes.io/secret/cb43fce4-df4e-4cca-a455-90d323512faf-metrics-tls\") pod \"dns-default-gdwg8\" (UID: \"cb43fce4-df4e-4cca-a455-90d323512faf\") " pod="openshift-dns/dns-default-gdwg8"
Apr 21 15:36:00.077473 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:00.077493 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fffa5175-92d5-48ec-a153-baf3f061b044-cert\") pod \"ingress-canary-jpjds\" (UID: \"fffa5175-92d5-48ec-a153-baf3f061b044\") " pod="openshift-ingress-canary/ingress-canary-jpjds"
Apr 21 15:36:00.078017 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:36:00.077578 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 15:36:00.078017 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:36:00.077581 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 15:36:00.078017 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:36:00.077626 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fffa5175-92d5-48ec-a153-baf3f061b044-cert podName:fffa5175-92d5-48ec-a153-baf3f061b044 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:02.077612496 +0000 UTC m=+37.649645416 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fffa5175-92d5-48ec-a153-baf3f061b044-cert") pod "ingress-canary-jpjds" (UID: "fffa5175-92d5-48ec-a153-baf3f061b044") : secret "canary-serving-cert" not found
Apr 21 15:36:00.078017 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:36:00.077637 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb43fce4-df4e-4cca-a455-90d323512faf-metrics-tls podName:cb43fce4-df4e-4cca-a455-90d323512faf nodeName:}" failed. No retries permitted until 2026-04-21 15:36:02.077631547 +0000 UTC m=+37.649664468 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cb43fce4-df4e-4cca-a455-90d323512faf-metrics-tls") pod "dns-default-gdwg8" (UID: "cb43fce4-df4e-4cca-a455-90d323512faf") : secret "dns-default-metrics-tls" not found
Apr 21 15:36:00.266525 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:00.266475 2569 generic.go:358] "Generic (PLEG): container finished" podID="402ecfa4-798f-4e6f-9d15-5c6ef953439a" containerID="b57d773b5ee657f2d4272975861f5195c2d2cf7aefc306c5afc305b8e27400eb" exitCode=0
Apr 21 15:36:00.266700 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:00.266534 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xwpr7" event={"ID":"402ecfa4-798f-4e6f-9d15-5c6ef953439a","Type":"ContainerDied","Data":"b57d773b5ee657f2d4272975861f5195c2d2cf7aefc306c5afc305b8e27400eb"}
Apr 21 15:36:01.270981 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:01.270950 2569 generic.go:358] "Generic (PLEG): container finished" podID="402ecfa4-798f-4e6f-9d15-5c6ef953439a" containerID="d8a223fb7b690dc53f11c3508aa3fa99cb35f15cbd7fed643972d0dbdf2d565f" exitCode=0
Apr 21 15:36:01.271382 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:01.270997 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xwpr7" event={"ID":"402ecfa4-798f-4e6f-9d15-5c6ef953439a","Type":"ContainerDied","Data":"d8a223fb7b690dc53f11c3508aa3fa99cb35f15cbd7fed643972d0dbdf2d565f"}
Apr 21 15:36:02.093953 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:02.093858 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb43fce4-df4e-4cca-a455-90d323512faf-metrics-tls\") pod \"dns-default-gdwg8\" (UID: \"cb43fce4-df4e-4cca-a455-90d323512faf\") " pod="openshift-dns/dns-default-gdwg8"
Apr 21 15:36:02.093953 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:02.093913 2569 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fffa5175-92d5-48ec-a153-baf3f061b044-cert\") pod \"ingress-canary-jpjds\" (UID: \"fffa5175-92d5-48ec-a153-baf3f061b044\") " pod="openshift-ingress-canary/ingress-canary-jpjds" Apr 21 15:36:02.094152 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:36:02.094017 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 15:36:02.094152 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:36:02.094052 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:36:02.094152 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:36:02.094108 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb43fce4-df4e-4cca-a455-90d323512faf-metrics-tls podName:cb43fce4-df4e-4cca-a455-90d323512faf nodeName:}" failed. No retries permitted until 2026-04-21 15:36:06.094081474 +0000 UTC m=+41.666114396 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cb43fce4-df4e-4cca-a455-90d323512faf-metrics-tls") pod "dns-default-gdwg8" (UID: "cb43fce4-df4e-4cca-a455-90d323512faf") : secret "dns-default-metrics-tls" not found Apr 21 15:36:02.094152 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:36:02.094126 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fffa5175-92d5-48ec-a153-baf3f061b044-cert podName:fffa5175-92d5-48ec-a153-baf3f061b044 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:06.094117896 +0000 UTC m=+41.666150816 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fffa5175-92d5-48ec-a153-baf3f061b044-cert") pod "ingress-canary-jpjds" (UID: "fffa5175-92d5-48ec-a153-baf3f061b044") : secret "canary-serving-cert" not found Apr 21 15:36:02.275822 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:02.275789 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xwpr7" event={"ID":"402ecfa4-798f-4e6f-9d15-5c6ef953439a","Type":"ContainerStarted","Data":"b7ddb8a9729e19aeb80819e7c1efc80bee9c366d67449961b8ae0fbed4a9fe29"} Apr 21 15:36:02.300140 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:02.300093 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-xwpr7" podStartSLOduration=5.399664733 podStartE2EDuration="37.300076703s" podCreationTimestamp="2026-04-21 15:35:25 +0000 UTC" firstStartedPulling="2026-04-21 15:35:27.321286868 +0000 UTC m=+2.893319804" lastFinishedPulling="2026-04-21 15:35:59.221698852 +0000 UTC m=+34.793731774" observedRunningTime="2026-04-21 15:36:02.298460758 +0000 UTC m=+37.870493702" watchObservedRunningTime="2026-04-21 15:36:02.300076703 +0000 UTC m=+37.872109645" Apr 21 15:36:06.122217 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:06.122021 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb43fce4-df4e-4cca-a455-90d323512faf-metrics-tls\") pod \"dns-default-gdwg8\" (UID: \"cb43fce4-df4e-4cca-a455-90d323512faf\") " pod="openshift-dns/dns-default-gdwg8" Apr 21 15:36:06.122648 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:06.122225 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fffa5175-92d5-48ec-a153-baf3f061b044-cert\") pod \"ingress-canary-jpjds\" (UID: \"fffa5175-92d5-48ec-a153-baf3f061b044\") " 
pod="openshift-ingress-canary/ingress-canary-jpjds" Apr 21 15:36:06.122648 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:36:06.122177 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 15:36:06.122648 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:36:06.122330 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:36:06.122648 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:36:06.122338 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb43fce4-df4e-4cca-a455-90d323512faf-metrics-tls podName:cb43fce4-df4e-4cca-a455-90d323512faf nodeName:}" failed. No retries permitted until 2026-04-21 15:36:14.122319874 +0000 UTC m=+49.694352794 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cb43fce4-df4e-4cca-a455-90d323512faf-metrics-tls") pod "dns-default-gdwg8" (UID: "cb43fce4-df4e-4cca-a455-90d323512faf") : secret "dns-default-metrics-tls" not found Apr 21 15:36:06.122648 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:36:06.122366 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fffa5175-92d5-48ec-a153-baf3f061b044-cert podName:fffa5175-92d5-48ec-a153-baf3f061b044 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:14.122355328 +0000 UTC m=+49.694388250 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fffa5175-92d5-48ec-a153-baf3f061b044-cert") pod "ingress-canary-jpjds" (UID: "fffa5175-92d5-48ec-a153-baf3f061b044") : secret "canary-serving-cert" not found Apr 21 15:36:07.934040 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:07.933998 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/324dbbf6-3ab0-424a-a580-24aa71c13cb6-original-pull-secret\") pod \"global-pull-secret-syncer-4mz5z\" (UID: \"324dbbf6-3ab0-424a-a580-24aa71c13cb6\") " pod="kube-system/global-pull-secret-syncer-4mz5z" Apr 21 15:36:07.936714 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:07.936687 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/324dbbf6-3ab0-424a-a580-24aa71c13cb6-original-pull-secret\") pod \"global-pull-secret-syncer-4mz5z\" (UID: \"324dbbf6-3ab0-424a-a580-24aa71c13cb6\") " pod="kube-system/global-pull-secret-syncer-4mz5z" Apr 21 15:36:08.159963 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:08.159920 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-4mz5z" Apr 21 15:36:08.332780 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:08.332742 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-4mz5z"] Apr 21 15:36:08.336589 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:36:08.336555 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod324dbbf6_3ab0_424a_a580_24aa71c13cb6.slice/crio-a078c98e1ca023ba47cbbf2ee437542ed0d5142938b067d09cc243dde0dbd95a WatchSource:0}: Error finding container a078c98e1ca023ba47cbbf2ee437542ed0d5142938b067d09cc243dde0dbd95a: Status 404 returned error can't find the container with id a078c98e1ca023ba47cbbf2ee437542ed0d5142938b067d09cc243dde0dbd95a Apr 21 15:36:09.291231 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:09.291173 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-4mz5z" event={"ID":"324dbbf6-3ab0-424a-a580-24aa71c13cb6","Type":"ContainerStarted","Data":"a078c98e1ca023ba47cbbf2ee437542ed0d5142938b067d09cc243dde0dbd95a"} Apr 21 15:36:13.300006 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:13.299968 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-4mz5z" event={"ID":"324dbbf6-3ab0-424a-a580-24aa71c13cb6","Type":"ContainerStarted","Data":"83d5498fd655dfe96303c9d986a7885a25f3c1e8b8a4c16a1115f9bf1ac2b6a1"} Apr 21 15:36:13.316095 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:13.316043 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-4mz5z" podStartSLOduration=33.315862102 podStartE2EDuration="37.316030827s" podCreationTimestamp="2026-04-21 15:35:36 +0000 UTC" firstStartedPulling="2026-04-21 15:36:08.338382653 +0000 UTC m=+43.910415586" lastFinishedPulling="2026-04-21 15:36:12.338551388 +0000 UTC m=+47.910584311" 
observedRunningTime="2026-04-21 15:36:13.315671943 +0000 UTC m=+48.887704887" watchObservedRunningTime="2026-04-21 15:36:13.316030827 +0000 UTC m=+48.888063770" Apr 21 15:36:14.180794 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:14.180759 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb43fce4-df4e-4cca-a455-90d323512faf-metrics-tls\") pod \"dns-default-gdwg8\" (UID: \"cb43fce4-df4e-4cca-a455-90d323512faf\") " pod="openshift-dns/dns-default-gdwg8" Apr 21 15:36:14.180794 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:14.180795 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fffa5175-92d5-48ec-a153-baf3f061b044-cert\") pod \"ingress-canary-jpjds\" (UID: \"fffa5175-92d5-48ec-a153-baf3f061b044\") " pod="openshift-ingress-canary/ingress-canary-jpjds" Apr 21 15:36:14.181025 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:36:14.180896 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:36:14.181025 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:36:14.180904 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 15:36:14.181025 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:36:14.180962 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fffa5175-92d5-48ec-a153-baf3f061b044-cert podName:fffa5175-92d5-48ec-a153-baf3f061b044 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:30.180945563 +0000 UTC m=+65.752978484 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fffa5175-92d5-48ec-a153-baf3f061b044-cert") pod "ingress-canary-jpjds" (UID: "fffa5175-92d5-48ec-a153-baf3f061b044") : secret "canary-serving-cert" not found Apr 21 15:36:14.181025 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:36:14.180980 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb43fce4-df4e-4cca-a455-90d323512faf-metrics-tls podName:cb43fce4-df4e-4cca-a455-90d323512faf nodeName:}" failed. No retries permitted until 2026-04-21 15:36:30.180972071 +0000 UTC m=+65.753004995 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cb43fce4-df4e-4cca-a455-90d323512faf-metrics-tls") pod "dns-default-gdwg8" (UID: "cb43fce4-df4e-4cca-a455-90d323512faf") : secret "dns-default-metrics-tls" not found Apr 21 15:36:24.262364 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:24.262330 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wkxqg" Apr 21 15:36:30.193558 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:30.193519 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb43fce4-df4e-4cca-a455-90d323512faf-metrics-tls\") pod \"dns-default-gdwg8\" (UID: \"cb43fce4-df4e-4cca-a455-90d323512faf\") " pod="openshift-dns/dns-default-gdwg8" Apr 21 15:36:30.193558 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:30.193558 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fffa5175-92d5-48ec-a153-baf3f061b044-cert\") pod \"ingress-canary-jpjds\" (UID: \"fffa5175-92d5-48ec-a153-baf3f061b044\") " pod="openshift-ingress-canary/ingress-canary-jpjds" Apr 21 15:36:30.194179 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:36:30.193704 2569 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 15:36:30.194179 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:36:30.193735 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:36:30.194179 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:36:30.193799 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb43fce4-df4e-4cca-a455-90d323512faf-metrics-tls podName:cb43fce4-df4e-4cca-a455-90d323512faf nodeName:}" failed. No retries permitted until 2026-04-21 15:37:02.193775277 +0000 UTC m=+97.765808202 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cb43fce4-df4e-4cca-a455-90d323512faf-metrics-tls") pod "dns-default-gdwg8" (UID: "cb43fce4-df4e-4cca-a455-90d323512faf") : secret "dns-default-metrics-tls" not found Apr 21 15:36:30.194179 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:36:30.193820 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fffa5175-92d5-48ec-a153-baf3f061b044-cert podName:fffa5175-92d5-48ec-a153-baf3f061b044 nodeName:}" failed. No retries permitted until 2026-04-21 15:37:02.193810512 +0000 UTC m=+97.765843440 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fffa5175-92d5-48ec-a153-baf3f061b044-cert") pod "ingress-canary-jpjds" (UID: "fffa5175-92d5-48ec-a153-baf3f061b044") : secret "canary-serving-cert" not found Apr 21 15:36:30.697148 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:30.697105 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/893ee07d-ac5e-4593-93fd-80655b690072-metrics-certs\") pod \"network-metrics-daemon-lp64c\" (UID: \"893ee07d-ac5e-4593-93fd-80655b690072\") " pod="openshift-multus/network-metrics-daemon-lp64c" Apr 21 15:36:30.702145 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:30.702118 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 15:36:30.707712 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:36:30.707687 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 15:36:30.707814 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:36:30.707766 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/893ee07d-ac5e-4593-93fd-80655b690072-metrics-certs podName:893ee07d-ac5e-4593-93fd-80655b690072 nodeName:}" failed. No retries permitted until 2026-04-21 15:37:34.707742872 +0000 UTC m=+130.279775794 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/893ee07d-ac5e-4593-93fd-80655b690072-metrics-certs") pod "network-metrics-daemon-lp64c" (UID: "893ee07d-ac5e-4593-93fd-80655b690072") : secret "metrics-daemon-secret" not found Apr 21 15:36:30.797408 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:30.797374 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2jzd\" (UniqueName: \"kubernetes.io/projected/bbf400a9-66da-48f5-ba51-6ecd75c50fa2-kube-api-access-v2jzd\") pod \"network-check-target-ct78s\" (UID: \"bbf400a9-66da-48f5-ba51-6ecd75c50fa2\") " pod="openshift-network-diagnostics/network-check-target-ct78s" Apr 21 15:36:30.800645 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:30.800624 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 15:36:30.811496 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:30.811456 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 15:36:30.821408 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:30.821378 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2jzd\" (UniqueName: \"kubernetes.io/projected/bbf400a9-66da-48f5-ba51-6ecd75c50fa2-kube-api-access-v2jzd\") pod \"network-check-target-ct78s\" (UID: \"bbf400a9-66da-48f5-ba51-6ecd75c50fa2\") " pod="openshift-network-diagnostics/network-check-target-ct78s" Apr 21 15:36:30.968819 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:30.968732 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-ndcm7\"" Apr 21 15:36:30.976270 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:30.976252 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ct78s" Apr 21 15:36:31.118633 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:31.118601 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ct78s"] Apr 21 15:36:31.123588 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:36:31.123561 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbf400a9_66da_48f5_ba51_6ecd75c50fa2.slice/crio-99dd60ad4d2afb1e740ec81f4b99e6b47d83d1820dd64568d30da427d25ef977 WatchSource:0}: Error finding container 99dd60ad4d2afb1e740ec81f4b99e6b47d83d1820dd64568d30da427d25ef977: Status 404 returned error can't find the container with id 99dd60ad4d2afb1e740ec81f4b99e6b47d83d1820dd64568d30da427d25ef977 Apr 21 15:36:31.335215 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:31.335130 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ct78s" event={"ID":"bbf400a9-66da-48f5-ba51-6ecd75c50fa2","Type":"ContainerStarted","Data":"99dd60ad4d2afb1e740ec81f4b99e6b47d83d1820dd64568d30da427d25ef977"} Apr 21 15:36:34.342855 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:34.342821 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ct78s" event={"ID":"bbf400a9-66da-48f5-ba51-6ecd75c50fa2","Type":"ContainerStarted","Data":"cb4da228cb3ddae0510c9aee41dca0933ab418186018cac7d53349054b27e270"} Apr 21 15:36:34.343207 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:36:34.342941 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-ct78s" Apr 21 15:37:02.215177 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:02.215131 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/cb43fce4-df4e-4cca-a455-90d323512faf-metrics-tls\") pod \"dns-default-gdwg8\" (UID: \"cb43fce4-df4e-4cca-a455-90d323512faf\") " pod="openshift-dns/dns-default-gdwg8" Apr 21 15:37:02.215177 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:02.215181 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fffa5175-92d5-48ec-a153-baf3f061b044-cert\") pod \"ingress-canary-jpjds\" (UID: \"fffa5175-92d5-48ec-a153-baf3f061b044\") " pod="openshift-ingress-canary/ingress-canary-jpjds" Apr 21 15:37:02.215665 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:02.215271 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:37:02.215665 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:02.215275 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 15:37:02.215665 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:02.215333 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fffa5175-92d5-48ec-a153-baf3f061b044-cert podName:fffa5175-92d5-48ec-a153-baf3f061b044 nodeName:}" failed. No retries permitted until 2026-04-21 15:38:06.215319048 +0000 UTC m=+161.787351969 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fffa5175-92d5-48ec-a153-baf3f061b044-cert") pod "ingress-canary-jpjds" (UID: "fffa5175-92d5-48ec-a153-baf3f061b044") : secret "canary-serving-cert" not found Apr 21 15:37:02.215665 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:02.215346 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb43fce4-df4e-4cca-a455-90d323512faf-metrics-tls podName:cb43fce4-df4e-4cca-a455-90d323512faf nodeName:}" failed. 
No retries permitted until 2026-04-21 15:38:06.215340197 +0000 UTC m=+161.787373119 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cb43fce4-df4e-4cca-a455-90d323512faf-metrics-tls") pod "dns-default-gdwg8" (UID: "cb43fce4-df4e-4cca-a455-90d323512faf") : secret "dns-default-metrics-tls" not found Apr 21 15:37:05.347511 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:05.347452 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-ct78s" Apr 21 15:37:05.374590 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:05.374545 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-ct78s" podStartSLOduration=97.715782654 podStartE2EDuration="1m40.374530629s" podCreationTimestamp="2026-04-21 15:35:25 +0000 UTC" firstStartedPulling="2026-04-21 15:36:31.125335116 +0000 UTC m=+66.697368050" lastFinishedPulling="2026-04-21 15:36:33.784083104 +0000 UTC m=+69.356116025" observedRunningTime="2026-04-21 15:36:34.375945053 +0000 UTC m=+69.947978030" watchObservedRunningTime="2026-04-21 15:37:05.374530629 +0000 UTC m=+100.946563573" Apr 21 15:37:18.346121 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.346084 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-mbdj9"] Apr 21 15:37:18.347773 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.347758 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-744b6c97cd-ktnzw"] Apr 21 15:37:18.347914 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.347897 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-mbdj9" Apr 21 15:37:18.349539 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.349514 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-744b6c97cd-ktnzw" Apr 21 15:37:18.351838 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.351806 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 21 15:37:18.352012 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.351997 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 21 15:37:18.352095 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.352016 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 21 15:37:18.352095 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.352039 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-hnn5q\"" Apr 21 15:37:18.352095 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.352070 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 15:37:18.352238 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.352094 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 15:37:18.352335 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.352320 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 21 15:37:18.352857 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.352841 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 21 15:37:18.352857 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.352851 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-jgbz7\"" Apr 
21 15:37:18.353097 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.352851 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 21 15:37:18.353177 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.353162 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 21 15:37:18.353436 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.353417 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 21 15:37:18.359337 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.359321 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 21 15:37:18.363991 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.363965 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-mbdj9"] Apr 21 15:37:18.365369 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.365348 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-744b6c97cd-ktnzw"] Apr 21 15:37:18.427594 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.427560 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv478\" (UniqueName: \"kubernetes.io/projected/e601bc85-e6ec-4e42-8728-90ec8be4699c-kube-api-access-nv478\") pod \"router-default-744b6c97cd-ktnzw\" (UID: \"e601bc85-e6ec-4e42-8728-90ec8be4699c\") " pod="openshift-ingress/router-default-744b6c97cd-ktnzw" Apr 21 15:37:18.427594 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.427593 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9b8e8467-e89b-4cd1-b772-d34e71416962-tmp\") pod \"insights-operator-585dfdc468-mbdj9\" (UID: 
\"9b8e8467-e89b-4cd1-b772-d34e71416962\") " pod="openshift-insights/insights-operator-585dfdc468-mbdj9" Apr 21 15:37:18.427848 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.427620 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk29m\" (UniqueName: \"kubernetes.io/projected/9b8e8467-e89b-4cd1-b772-d34e71416962-kube-api-access-pk29m\") pod \"insights-operator-585dfdc468-mbdj9\" (UID: \"9b8e8467-e89b-4cd1-b772-d34e71416962\") " pod="openshift-insights/insights-operator-585dfdc468-mbdj9" Apr 21 15:37:18.427848 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.427678 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e601bc85-e6ec-4e42-8728-90ec8be4699c-default-certificate\") pod \"router-default-744b6c97cd-ktnzw\" (UID: \"e601bc85-e6ec-4e42-8728-90ec8be4699c\") " pod="openshift-ingress/router-default-744b6c97cd-ktnzw" Apr 21 15:37:18.427848 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.427698 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/9b8e8467-e89b-4cd1-b772-d34e71416962-snapshots\") pod \"insights-operator-585dfdc468-mbdj9\" (UID: \"9b8e8467-e89b-4cd1-b772-d34e71416962\") " pod="openshift-insights/insights-operator-585dfdc468-mbdj9" Apr 21 15:37:18.427848 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.427777 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e601bc85-e6ec-4e42-8728-90ec8be4699c-metrics-certs\") pod \"router-default-744b6c97cd-ktnzw\" (UID: \"e601bc85-e6ec-4e42-8728-90ec8be4699c\") " pod="openshift-ingress/router-default-744b6c97cd-ktnzw" Apr 21 15:37:18.427848 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.427806 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b8e8467-e89b-4cd1-b772-d34e71416962-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-mbdj9\" (UID: \"9b8e8467-e89b-4cd1-b772-d34e71416962\") " pod="openshift-insights/insights-operator-585dfdc468-mbdj9"
Apr 21 15:37:18.428030 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.427863 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b8e8467-e89b-4cd1-b772-d34e71416962-service-ca-bundle\") pod \"insights-operator-585dfdc468-mbdj9\" (UID: \"9b8e8467-e89b-4cd1-b772-d34e71416962\") " pod="openshift-insights/insights-operator-585dfdc468-mbdj9"
Apr 21 15:37:18.428030 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.427899 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e601bc85-e6ec-4e42-8728-90ec8be4699c-service-ca-bundle\") pod \"router-default-744b6c97cd-ktnzw\" (UID: \"e601bc85-e6ec-4e42-8728-90ec8be4699c\") " pod="openshift-ingress/router-default-744b6c97cd-ktnzw"
Apr 21 15:37:18.428030 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.427914 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b8e8467-e89b-4cd1-b772-d34e71416962-serving-cert\") pod \"insights-operator-585dfdc468-mbdj9\" (UID: \"9b8e8467-e89b-4cd1-b772-d34e71416962\") " pod="openshift-insights/insights-operator-585dfdc468-mbdj9"
Apr 21 15:37:18.428030 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.427934 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e601bc85-e6ec-4e42-8728-90ec8be4699c-stats-auth\") pod \"router-default-744b6c97cd-ktnzw\" (UID: \"e601bc85-e6ec-4e42-8728-90ec8be4699c\") " pod="openshift-ingress/router-default-744b6c97cd-ktnzw"
Apr 21 15:37:18.528458 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.528416 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pk29m\" (UniqueName: \"kubernetes.io/projected/9b8e8467-e89b-4cd1-b772-d34e71416962-kube-api-access-pk29m\") pod \"insights-operator-585dfdc468-mbdj9\" (UID: \"9b8e8467-e89b-4cd1-b772-d34e71416962\") " pod="openshift-insights/insights-operator-585dfdc468-mbdj9"
Apr 21 15:37:18.528646 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.528608 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e601bc85-e6ec-4e42-8728-90ec8be4699c-default-certificate\") pod \"router-default-744b6c97cd-ktnzw\" (UID: \"e601bc85-e6ec-4e42-8728-90ec8be4699c\") " pod="openshift-ingress/router-default-744b6c97cd-ktnzw"
Apr 21 15:37:18.528646 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.528639 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/9b8e8467-e89b-4cd1-b772-d34e71416962-snapshots\") pod \"insights-operator-585dfdc468-mbdj9\" (UID: \"9b8e8467-e89b-4cd1-b772-d34e71416962\") " pod="openshift-insights/insights-operator-585dfdc468-mbdj9"
Apr 21 15:37:18.528759 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.528676 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e601bc85-e6ec-4e42-8728-90ec8be4699c-metrics-certs\") pod \"router-default-744b6c97cd-ktnzw\" (UID: \"e601bc85-e6ec-4e42-8728-90ec8be4699c\") " pod="openshift-ingress/router-default-744b6c97cd-ktnzw"
Apr 21 15:37:18.528759 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.528706 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b8e8467-e89b-4cd1-b772-d34e71416962-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-mbdj9\" (UID: \"9b8e8467-e89b-4cd1-b772-d34e71416962\") " pod="openshift-insights/insights-operator-585dfdc468-mbdj9"
Apr 21 15:37:18.528759 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.528746 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b8e8467-e89b-4cd1-b772-d34e71416962-service-ca-bundle\") pod \"insights-operator-585dfdc468-mbdj9\" (UID: \"9b8e8467-e89b-4cd1-b772-d34e71416962\") " pod="openshift-insights/insights-operator-585dfdc468-mbdj9"
Apr 21 15:37:18.528910 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:18.528785 2569 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 21 15:37:18.528910 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.528792 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e601bc85-e6ec-4e42-8728-90ec8be4699c-service-ca-bundle\") pod \"router-default-744b6c97cd-ktnzw\" (UID: \"e601bc85-e6ec-4e42-8728-90ec8be4699c\") " pod="openshift-ingress/router-default-744b6c97cd-ktnzw"
Apr 21 15:37:18.528910 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.528818 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b8e8467-e89b-4cd1-b772-d34e71416962-serving-cert\") pod \"insights-operator-585dfdc468-mbdj9\" (UID: \"9b8e8467-e89b-4cd1-b772-d34e71416962\") " pod="openshift-insights/insights-operator-585dfdc468-mbdj9"
Apr 21 15:37:18.528910 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:18.528862 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e601bc85-e6ec-4e42-8728-90ec8be4699c-metrics-certs podName:e601bc85-e6ec-4e42-8728-90ec8be4699c nodeName:}" failed. No retries permitted until 2026-04-21 15:37:19.028839538 +0000 UTC m=+114.600872466 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e601bc85-e6ec-4e42-8728-90ec8be4699c-metrics-certs") pod "router-default-744b6c97cd-ktnzw" (UID: "e601bc85-e6ec-4e42-8728-90ec8be4699c") : secret "router-metrics-certs-default" not found
Apr 21 15:37:18.528910 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.528887 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e601bc85-e6ec-4e42-8728-90ec8be4699c-stats-auth\") pod \"router-default-744b6c97cd-ktnzw\" (UID: \"e601bc85-e6ec-4e42-8728-90ec8be4699c\") " pod="openshift-ingress/router-default-744b6c97cd-ktnzw"
Apr 21 15:37:18.529210 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.528935 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nv478\" (UniqueName: \"kubernetes.io/projected/e601bc85-e6ec-4e42-8728-90ec8be4699c-kube-api-access-nv478\") pod \"router-default-744b6c97cd-ktnzw\" (UID: \"e601bc85-e6ec-4e42-8728-90ec8be4699c\") " pod="openshift-ingress/router-default-744b6c97cd-ktnzw"
Apr 21 15:37:18.529210 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:18.528955 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e601bc85-e6ec-4e42-8728-90ec8be4699c-service-ca-bundle podName:e601bc85-e6ec-4e42-8728-90ec8be4699c nodeName:}" failed. No retries permitted until 2026-04-21 15:37:19.028936903 +0000 UTC m=+114.600969848 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e601bc85-e6ec-4e42-8728-90ec8be4699c-service-ca-bundle") pod "router-default-744b6c97cd-ktnzw" (UID: "e601bc85-e6ec-4e42-8728-90ec8be4699c") : configmap references non-existent config key: service-ca.crt
Apr 21 15:37:18.529210 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.528988 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9b8e8467-e89b-4cd1-b772-d34e71416962-tmp\") pod \"insights-operator-585dfdc468-mbdj9\" (UID: \"9b8e8467-e89b-4cd1-b772-d34e71416962\") " pod="openshift-insights/insights-operator-585dfdc468-mbdj9"
Apr 21 15:37:18.529446 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.529377 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/9b8e8467-e89b-4cd1-b772-d34e71416962-snapshots\") pod \"insights-operator-585dfdc468-mbdj9\" (UID: \"9b8e8467-e89b-4cd1-b772-d34e71416962\") " pod="openshift-insights/insights-operator-585dfdc468-mbdj9"
Apr 21 15:37:18.529446 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.529436 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b8e8467-e89b-4cd1-b772-d34e71416962-service-ca-bundle\") pod \"insights-operator-585dfdc468-mbdj9\" (UID: \"9b8e8467-e89b-4cd1-b772-d34e71416962\") " pod="openshift-insights/insights-operator-585dfdc468-mbdj9"
Apr 21 15:37:18.529609 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.529556 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9b8e8467-e89b-4cd1-b772-d34e71416962-tmp\") pod \"insights-operator-585dfdc468-mbdj9\" (UID: \"9b8e8467-e89b-4cd1-b772-d34e71416962\") " pod="openshift-insights/insights-operator-585dfdc468-mbdj9"
Apr 21 15:37:18.529654 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.529613 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b8e8467-e89b-4cd1-b772-d34e71416962-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-mbdj9\" (UID: \"9b8e8467-e89b-4cd1-b772-d34e71416962\") " pod="openshift-insights/insights-operator-585dfdc468-mbdj9"
Apr 21 15:37:18.531371 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.531354 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e601bc85-e6ec-4e42-8728-90ec8be4699c-stats-auth\") pod \"router-default-744b6c97cd-ktnzw\" (UID: \"e601bc85-e6ec-4e42-8728-90ec8be4699c\") " pod="openshift-ingress/router-default-744b6c97cd-ktnzw"
Apr 21 15:37:18.531609 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.531591 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b8e8467-e89b-4cd1-b772-d34e71416962-serving-cert\") pod \"insights-operator-585dfdc468-mbdj9\" (UID: \"9b8e8467-e89b-4cd1-b772-d34e71416962\") " pod="openshift-insights/insights-operator-585dfdc468-mbdj9"
Apr 21 15:37:18.531650 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.531630 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e601bc85-e6ec-4e42-8728-90ec8be4699c-default-certificate\") pod \"router-default-744b6c97cd-ktnzw\" (UID: \"e601bc85-e6ec-4e42-8728-90ec8be4699c\") " pod="openshift-ingress/router-default-744b6c97cd-ktnzw"
Apr 21 15:37:18.542516 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.542470 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv478\" (UniqueName: \"kubernetes.io/projected/e601bc85-e6ec-4e42-8728-90ec8be4699c-kube-api-access-nv478\") pod \"router-default-744b6c97cd-ktnzw\" (UID: \"e601bc85-e6ec-4e42-8728-90ec8be4699c\") " pod="openshift-ingress/router-default-744b6c97cd-ktnzw"
Apr 21 15:37:18.543182 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.543157 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk29m\" (UniqueName: \"kubernetes.io/projected/9b8e8467-e89b-4cd1-b772-d34e71416962-kube-api-access-pk29m\") pod \"insights-operator-585dfdc468-mbdj9\" (UID: \"9b8e8467-e89b-4cd1-b772-d34e71416962\") " pod="openshift-insights/insights-operator-585dfdc468-mbdj9"
Apr 21 15:37:18.614197 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.614109 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-9nzlb"]
Apr 21 15:37:18.616515 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.616474 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9nzlb"
Apr 21 15:37:18.620024 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.620003 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-zftb4\""
Apr 21 15:37:18.630381 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.630359 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-9nzlb"]
Apr 21 15:37:18.658284 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.658250 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-mbdj9"
Apr 21 15:37:18.723614 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.723500 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gj5kb"]
Apr 21 15:37:18.727303 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.727101 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qdkns"]
Apr 21 15:37:18.729628 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.729603 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dhpnj"]
Apr 21 15:37:18.729814 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.729759 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qdkns"
Apr 21 15:37:18.729892 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.729827 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gj5kb"
Apr 21 15:37:18.730558 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.730022 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsr5l\" (UniqueName: \"kubernetes.io/projected/c18e8ca8-7b38-4eb1-8ec2-a2817736c6c4-kube-api-access-lsr5l\") pod \"network-check-source-8894fc9bd-9nzlb\" (UID: \"c18e8ca8-7b38-4eb1-8ec2-a2817736c6c4\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9nzlb"
Apr 21 15:37:18.731942 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.731738 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dhpnj"
Apr 21 15:37:18.739569 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.739459 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 21 15:37:18.741892 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.741572 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 21 15:37:18.741892 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.741754 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 21 15:37:18.741892 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.741854 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qdkns"]
Apr 21 15:37:18.742101 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.741913 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dhpnj"]
Apr 21 15:37:18.742591 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.742569 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-9ztdz\""
Apr 21 15:37:18.742776 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.742717 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 21 15:37:18.742995 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.742674 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 21 15:37:18.743091 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.743045 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 21 15:37:18.743183 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.743165 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gj5kb"]
Apr 21 15:37:18.743258 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.743244 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 21 15:37:18.743308 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.743258 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-22xm4\""
Apr 21 15:37:18.744199 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.744183 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 21 15:37:18.744400 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.744382 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 21 15:37:18.744703 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.744687 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 21 15:37:18.746858 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.746839 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-5pskt\""
Apr 21 15:37:18.747243 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.746920 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 21 15:37:18.794005 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.793979 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-mbdj9"]
Apr 21 15:37:18.797398 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:37:18.797372 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b8e8467_e89b_4cd1_b772_d34e71416962.slice/crio-23f4b73a254bc72d4439a36260e3f8c6c3d383b480fcee9da26d5bb608a80718 WatchSource:0}: Error finding container 23f4b73a254bc72d4439a36260e3f8c6c3d383b480fcee9da26d5bb608a80718: Status 404 returned error can't find the container with id 23f4b73a254bc72d4439a36260e3f8c6c3d383b480fcee9da26d5bb608a80718
Apr 21 15:37:18.830405 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.830365 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv7rp\" (UniqueName: \"kubernetes.io/projected/f4e9ee47-7719-418f-90bf-ada2af6eab08-kube-api-access-bv7rp\") pod \"kube-storage-version-migrator-operator-6769c5d45-qdkns\" (UID: \"f4e9ee47-7719-418f-90bf-ada2af6eab08\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qdkns"
Apr 21 15:37:18.830405 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.830409 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lsr5l\" (UniqueName: \"kubernetes.io/projected/c18e8ca8-7b38-4eb1-8ec2-a2817736c6c4-kube-api-access-lsr5l\") pod \"network-check-source-8894fc9bd-9nzlb\" (UID: \"c18e8ca8-7b38-4eb1-8ec2-a2817736c6c4\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9nzlb"
Apr 21 15:37:18.830636 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.830430 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53a049f2-3e84-402e-91d3-5911909aa995-serving-cert\") pod \"service-ca-operator-d6fc45fc5-dhpnj\" (UID: \"53a049f2-3e84-402e-91d3-5911909aa995\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dhpnj"
Apr 21 15:37:18.830636 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.830504 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4e9ee47-7719-418f-90bf-ada2af6eab08-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-qdkns\" (UID: \"f4e9ee47-7719-418f-90bf-ada2af6eab08\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qdkns"
Apr 21 15:37:18.830636 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.830523 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53a049f2-3e84-402e-91d3-5911909aa995-config\") pod \"service-ca-operator-d6fc45fc5-dhpnj\" (UID: \"53a049f2-3e84-402e-91d3-5911909aa995\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dhpnj"
Apr 21 15:37:18.830636 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.830554 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3beca87f-1e7f-40a9-bd98-faa9ab860ad7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gj5kb\" (UID: \"3beca87f-1e7f-40a9-bd98-faa9ab860ad7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gj5kb"
Apr 21 15:37:18.830636 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.830571 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88k2h\" (UniqueName: \"kubernetes.io/projected/3beca87f-1e7f-40a9-bd98-faa9ab860ad7-kube-api-access-88k2h\") pod \"cluster-samples-operator-6dc5bdb6b4-gj5kb\" (UID: \"3beca87f-1e7f-40a9-bd98-faa9ab860ad7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gj5kb"
Apr 21 15:37:18.830636 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.830585 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m8ww\" (UniqueName: \"kubernetes.io/projected/53a049f2-3e84-402e-91d3-5911909aa995-kube-api-access-9m8ww\") pod \"service-ca-operator-d6fc45fc5-dhpnj\" (UID: \"53a049f2-3e84-402e-91d3-5911909aa995\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dhpnj"
Apr 21 15:37:18.830636 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.830609 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4e9ee47-7719-418f-90bf-ada2af6eab08-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-qdkns\" (UID: \"f4e9ee47-7719-418f-90bf-ada2af6eab08\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qdkns"
Apr 21 15:37:18.841331 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.841306 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsr5l\" (UniqueName: \"kubernetes.io/projected/c18e8ca8-7b38-4eb1-8ec2-a2817736c6c4-kube-api-access-lsr5l\") pod \"network-check-source-8894fc9bd-9nzlb\" (UID: \"c18e8ca8-7b38-4eb1-8ec2-a2817736c6c4\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9nzlb"
Apr 21 15:37:18.924913 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.924878 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9nzlb"
Apr 21 15:37:18.931892 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.931865 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4e9ee47-7719-418f-90bf-ada2af6eab08-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-qdkns\" (UID: \"f4e9ee47-7719-418f-90bf-ada2af6eab08\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qdkns"
Apr 21 15:37:18.931962 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.931907 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53a049f2-3e84-402e-91d3-5911909aa995-config\") pod \"service-ca-operator-d6fc45fc5-dhpnj\" (UID: \"53a049f2-3e84-402e-91d3-5911909aa995\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dhpnj"
Apr 21 15:37:18.932079 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.932058 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3beca87f-1e7f-40a9-bd98-faa9ab860ad7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gj5kb\" (UID: \"3beca87f-1e7f-40a9-bd98-faa9ab860ad7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gj5kb"
Apr 21 15:37:18.932125 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.932092 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88k2h\" (UniqueName: \"kubernetes.io/projected/3beca87f-1e7f-40a9-bd98-faa9ab860ad7-kube-api-access-88k2h\") pod \"cluster-samples-operator-6dc5bdb6b4-gj5kb\" (UID: \"3beca87f-1e7f-40a9-bd98-faa9ab860ad7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gj5kb"
Apr 21 15:37:18.932125 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.932110 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9m8ww\" (UniqueName: \"kubernetes.io/projected/53a049f2-3e84-402e-91d3-5911909aa995-kube-api-access-9m8ww\") pod \"service-ca-operator-d6fc45fc5-dhpnj\" (UID: \"53a049f2-3e84-402e-91d3-5911909aa995\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dhpnj"
Apr 21 15:37:18.932225 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:18.932208 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 21 15:37:18.932279 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.932257 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4e9ee47-7719-418f-90bf-ada2af6eab08-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-qdkns\" (UID: \"f4e9ee47-7719-418f-90bf-ada2af6eab08\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qdkns"
Apr 21 15:37:18.932330 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:18.932279 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3beca87f-1e7f-40a9-bd98-faa9ab860ad7-samples-operator-tls podName:3beca87f-1e7f-40a9-bd98-faa9ab860ad7 nodeName:}" failed. No retries permitted until 2026-04-21 15:37:19.432259045 +0000 UTC m=+115.004291967 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3beca87f-1e7f-40a9-bd98-faa9ab860ad7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gj5kb" (UID: "3beca87f-1e7f-40a9-bd98-faa9ab860ad7") : secret "samples-operator-tls" not found
Apr 21 15:37:18.932408 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.932391 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bv7rp\" (UniqueName: \"kubernetes.io/projected/f4e9ee47-7719-418f-90bf-ada2af6eab08-kube-api-access-bv7rp\") pod \"kube-storage-version-migrator-operator-6769c5d45-qdkns\" (UID: \"f4e9ee47-7719-418f-90bf-ada2af6eab08\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qdkns"
Apr 21 15:37:18.932464 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.932432 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53a049f2-3e84-402e-91d3-5911909aa995-serving-cert\") pod \"service-ca-operator-d6fc45fc5-dhpnj\" (UID: \"53a049f2-3e84-402e-91d3-5911909aa995\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dhpnj"
Apr 21 15:37:18.932839 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.932808 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4e9ee47-7719-418f-90bf-ada2af6eab08-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-qdkns\" (UID: \"f4e9ee47-7719-418f-90bf-ada2af6eab08\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qdkns"
Apr 21 15:37:18.933116 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.933094 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53a049f2-3e84-402e-91d3-5911909aa995-config\") pod \"service-ca-operator-d6fc45fc5-dhpnj\" (UID: \"53a049f2-3e84-402e-91d3-5911909aa995\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dhpnj"
Apr 21 15:37:18.934176 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.934156 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4e9ee47-7719-418f-90bf-ada2af6eab08-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-qdkns\" (UID: \"f4e9ee47-7719-418f-90bf-ada2af6eab08\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qdkns"
Apr 21 15:37:18.934509 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.934471 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53a049f2-3e84-402e-91d3-5911909aa995-serving-cert\") pod \"service-ca-operator-d6fc45fc5-dhpnj\" (UID: \"53a049f2-3e84-402e-91d3-5911909aa995\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dhpnj"
Apr 21 15:37:18.945097 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.945064 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv7rp\" (UniqueName: \"kubernetes.io/projected/f4e9ee47-7719-418f-90bf-ada2af6eab08-kube-api-access-bv7rp\") pod \"kube-storage-version-migrator-operator-6769c5d45-qdkns\" (UID: \"f4e9ee47-7719-418f-90bf-ada2af6eab08\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qdkns"
Apr 21 15:37:18.945255 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.945237 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-88k2h\" (UniqueName: \"kubernetes.io/projected/3beca87f-1e7f-40a9-bd98-faa9ab860ad7-kube-api-access-88k2h\") pod \"cluster-samples-operator-6dc5bdb6b4-gj5kb\" (UID: \"3beca87f-1e7f-40a9-bd98-faa9ab860ad7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gj5kb"
Apr 21 15:37:18.945309 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:18.945288 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m8ww\" (UniqueName: \"kubernetes.io/projected/53a049f2-3e84-402e-91d3-5911909aa995-kube-api-access-9m8ww\") pod \"service-ca-operator-d6fc45fc5-dhpnj\" (UID: \"53a049f2-3e84-402e-91d3-5911909aa995\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dhpnj"
Apr 21 15:37:19.033730 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:19.033686 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e601bc85-e6ec-4e42-8728-90ec8be4699c-metrics-certs\") pod \"router-default-744b6c97cd-ktnzw\" (UID: \"e601bc85-e6ec-4e42-8728-90ec8be4699c\") " pod="openshift-ingress/router-default-744b6c97cd-ktnzw"
Apr 21 15:37:19.033909 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:19.033773 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e601bc85-e6ec-4e42-8728-90ec8be4699c-service-ca-bundle\") pod \"router-default-744b6c97cd-ktnzw\" (UID: \"e601bc85-e6ec-4e42-8728-90ec8be4699c\") " pod="openshift-ingress/router-default-744b6c97cd-ktnzw"
Apr 21 15:37:19.033909 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:19.033828 2569 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 21 15:37:19.033909 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:19.033897 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e601bc85-e6ec-4e42-8728-90ec8be4699c-metrics-certs podName:e601bc85-e6ec-4e42-8728-90ec8be4699c nodeName:}" failed. No retries permitted until 2026-04-21 15:37:20.033881151 +0000 UTC m=+115.605914072 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e601bc85-e6ec-4e42-8728-90ec8be4699c-metrics-certs") pod "router-default-744b6c97cd-ktnzw" (UID: "e601bc85-e6ec-4e42-8728-90ec8be4699c") : secret "router-metrics-certs-default" not found
Apr 21 15:37:19.034042 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:19.033914 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e601bc85-e6ec-4e42-8728-90ec8be4699c-service-ca-bundle podName:e601bc85-e6ec-4e42-8728-90ec8be4699c nodeName:}" failed. No retries permitted until 2026-04-21 15:37:20.033904292 +0000 UTC m=+115.605937213 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e601bc85-e6ec-4e42-8728-90ec8be4699c-service-ca-bundle") pod "router-default-744b6c97cd-ktnzw" (UID: "e601bc85-e6ec-4e42-8728-90ec8be4699c") : configmap references non-existent config key: service-ca.crt
Apr 21 15:37:19.042150 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:19.042122 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-9nzlb"]
Apr 21 15:37:19.043968 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:19.043942 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qdkns"
Apr 21 15:37:19.044283 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:37:19.044258 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc18e8ca8_7b38_4eb1_8ec2_a2817736c6c4.slice/crio-8ab44621df1d16e30bebeb207cd160157b54298ea02b96127ddea2831dfe15c9 WatchSource:0}: Error finding container 8ab44621df1d16e30bebeb207cd160157b54298ea02b96127ddea2831dfe15c9: Status 404 returned error can't find the container with id 8ab44621df1d16e30bebeb207cd160157b54298ea02b96127ddea2831dfe15c9
Apr 21 15:37:19.058308 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:19.058284 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dhpnj"
Apr 21 15:37:19.181830 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:19.181684 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qdkns"]
Apr 21 15:37:19.198962 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:19.198930 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dhpnj"]
Apr 21 15:37:19.203095 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:37:19.203065 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53a049f2_3e84_402e_91d3_5911909aa995.slice/crio-a4f721cc43f34413048abf3964027120aa6d8703b974d25fed0036a521e5a3b1 WatchSource:0}: Error finding container a4f721cc43f34413048abf3964027120aa6d8703b974d25fed0036a521e5a3b1: Status 404 returned error can't find the container with id a4f721cc43f34413048abf3964027120aa6d8703b974d25fed0036a521e5a3b1
Apr 21 15:37:19.432173 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:19.432066 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qdkns" event={"ID":"f4e9ee47-7719-418f-90bf-ada2af6eab08","Type":"ContainerStarted","Data":"f70a70148568dbed7b60c270aa121553832d39be82ba3af2b87f6cbea9d58402"}
Apr 21 15:37:19.433309 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:19.433264 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-mbdj9" event={"ID":"9b8e8467-e89b-4cd1-b772-d34e71416962","Type":"ContainerStarted","Data":"23f4b73a254bc72d4439a36260e3f8c6c3d383b480fcee9da26d5bb608a80718"}
Apr 21 15:37:19.434778 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:19.434750 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9nzlb" event={"ID":"c18e8ca8-7b38-4eb1-8ec2-a2817736c6c4","Type":"ContainerStarted","Data":"122fe97ed854c922cd8f6002a3747124fa6357d2acbced47d680dd7de53f9fb3"}
Apr 21 15:37:19.434890 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:19.434785 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9nzlb" event={"ID":"c18e8ca8-7b38-4eb1-8ec2-a2817736c6c4","Type":"ContainerStarted","Data":"8ab44621df1d16e30bebeb207cd160157b54298ea02b96127ddea2831dfe15c9"}
Apr 21 15:37:19.436044 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:19.436008 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dhpnj" event={"ID":"53a049f2-3e84-402e-91d3-5911909aa995","Type":"ContainerStarted","Data":"a4f721cc43f34413048abf3964027120aa6d8703b974d25fed0036a521e5a3b1"}
Apr 21 15:37:19.438691 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:19.438666 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3beca87f-1e7f-40a9-bd98-faa9ab860ad7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gj5kb\" (UID: \"3beca87f-1e7f-40a9-bd98-faa9ab860ad7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gj5kb"
Apr 21 15:37:19.438841 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:19.438820 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 21 15:37:19.438916 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:19.438893 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3beca87f-1e7f-40a9-bd98-faa9ab860ad7-samples-operator-tls podName:3beca87f-1e7f-40a9-bd98-faa9ab860ad7 nodeName:}" failed. No retries permitted until 2026-04-21 15:37:20.438874175 +0000 UTC m=+116.010907102 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3beca87f-1e7f-40a9-bd98-faa9ab860ad7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gj5kb" (UID: "3beca87f-1e7f-40a9-bd98-faa9ab860ad7") : secret "samples-operator-tls" not found
Apr 21 15:37:19.451622 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:19.451569 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9nzlb" podStartSLOduration=1.451554344 podStartE2EDuration="1.451554344s" podCreationTimestamp="2026-04-21 15:37:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:37:19.450699034 +0000 UTC m=+115.022732003" watchObservedRunningTime="2026-04-21 15:37:19.451554344 +0000 UTC m=+115.023587280"
Apr 21 15:37:20.045744 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:20.044961 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/e601bc85-e6ec-4e42-8728-90ec8be4699c-metrics-certs\") pod \"router-default-744b6c97cd-ktnzw\" (UID: \"e601bc85-e6ec-4e42-8728-90ec8be4699c\") " pod="openshift-ingress/router-default-744b6c97cd-ktnzw" Apr 21 15:37:20.045744 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:20.045061 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e601bc85-e6ec-4e42-8728-90ec8be4699c-service-ca-bundle\") pod \"router-default-744b6c97cd-ktnzw\" (UID: \"e601bc85-e6ec-4e42-8728-90ec8be4699c\") " pod="openshift-ingress/router-default-744b6c97cd-ktnzw" Apr 21 15:37:20.045744 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:20.045221 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e601bc85-e6ec-4e42-8728-90ec8be4699c-service-ca-bundle podName:e601bc85-e6ec-4e42-8728-90ec8be4699c nodeName:}" failed. No retries permitted until 2026-04-21 15:37:22.045200053 +0000 UTC m=+117.617232976 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e601bc85-e6ec-4e42-8728-90ec8be4699c-service-ca-bundle") pod "router-default-744b6c97cd-ktnzw" (UID: "e601bc85-e6ec-4e42-8728-90ec8be4699c") : configmap references non-existent config key: service-ca.crt Apr 21 15:37:20.045744 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:20.045656 2569 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 15:37:20.045744 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:20.045710 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e601bc85-e6ec-4e42-8728-90ec8be4699c-metrics-certs podName:e601bc85-e6ec-4e42-8728-90ec8be4699c nodeName:}" failed. No retries permitted until 2026-04-21 15:37:22.045693967 +0000 UTC m=+117.617726892 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e601bc85-e6ec-4e42-8728-90ec8be4699c-metrics-certs") pod "router-default-744b6c97cd-ktnzw" (UID: "e601bc85-e6ec-4e42-8728-90ec8be4699c") : secret "router-metrics-certs-default" not found Apr 21 15:37:20.449079 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:20.449041 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3beca87f-1e7f-40a9-bd98-faa9ab860ad7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gj5kb\" (UID: \"3beca87f-1e7f-40a9-bd98-faa9ab860ad7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gj5kb" Apr 21 15:37:20.449554 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:20.449218 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 15:37:20.449554 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:20.449295 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3beca87f-1e7f-40a9-bd98-faa9ab860ad7-samples-operator-tls podName:3beca87f-1e7f-40a9-bd98-faa9ab860ad7 nodeName:}" failed. No retries permitted until 2026-04-21 15:37:22.449273528 +0000 UTC m=+118.021306454 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3beca87f-1e7f-40a9-bd98-faa9ab860ad7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gj5kb" (UID: "3beca87f-1e7f-40a9-bd98-faa9ab860ad7") : secret "samples-operator-tls" not found Apr 21 15:37:22.064668 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:22.064626 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e601bc85-e6ec-4e42-8728-90ec8be4699c-metrics-certs\") pod \"router-default-744b6c97cd-ktnzw\" (UID: \"e601bc85-e6ec-4e42-8728-90ec8be4699c\") " pod="openshift-ingress/router-default-744b6c97cd-ktnzw" Apr 21 15:37:22.065134 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:22.064694 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e601bc85-e6ec-4e42-8728-90ec8be4699c-service-ca-bundle\") pod \"router-default-744b6c97cd-ktnzw\" (UID: \"e601bc85-e6ec-4e42-8728-90ec8be4699c\") " pod="openshift-ingress/router-default-744b6c97cd-ktnzw" Apr 21 15:37:22.065134 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:22.064808 2569 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 15:37:22.065134 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:22.064831 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e601bc85-e6ec-4e42-8728-90ec8be4699c-service-ca-bundle podName:e601bc85-e6ec-4e42-8728-90ec8be4699c nodeName:}" failed. No retries permitted until 2026-04-21 15:37:26.064807751 +0000 UTC m=+121.636840677 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e601bc85-e6ec-4e42-8728-90ec8be4699c-service-ca-bundle") pod "router-default-744b6c97cd-ktnzw" (UID: "e601bc85-e6ec-4e42-8728-90ec8be4699c") : configmap references non-existent config key: service-ca.crt Apr 21 15:37:22.065134 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:22.064876 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e601bc85-e6ec-4e42-8728-90ec8be4699c-metrics-certs podName:e601bc85-e6ec-4e42-8728-90ec8be4699c nodeName:}" failed. No retries permitted until 2026-04-21 15:37:26.064857623 +0000 UTC m=+121.636890549 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e601bc85-e6ec-4e42-8728-90ec8be4699c-metrics-certs") pod "router-default-744b6c97cd-ktnzw" (UID: "e601bc85-e6ec-4e42-8728-90ec8be4699c") : secret "router-metrics-certs-default" not found Apr 21 15:37:22.445997 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:22.445963 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dhpnj" event={"ID":"53a049f2-3e84-402e-91d3-5911909aa995","Type":"ContainerStarted","Data":"5af54804d171ec5bdb52c46a9aba341b6d40d56db78a71a51eb3cd9a8d0ebf69"} Apr 21 15:37:22.447497 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:22.447451 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qdkns" event={"ID":"f4e9ee47-7719-418f-90bf-ada2af6eab08","Type":"ContainerStarted","Data":"ff73da9c914d8b48ab62707b6ccbcaf1dc60519ac06589b93e48c5deea0c39d6"} Apr 21 15:37:22.448888 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:22.448865 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-mbdj9" 
event={"ID":"9b8e8467-e89b-4cd1-b772-d34e71416962","Type":"ContainerStarted","Data":"5f6b441fe8aa508a2cf3f8696bccecb0fe355dbb04922c3466db54146cc8d25b"} Apr 21 15:37:22.468315 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:22.468263 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dhpnj" podStartSLOduration=1.813720124 podStartE2EDuration="4.468250897s" podCreationTimestamp="2026-04-21 15:37:18 +0000 UTC" firstStartedPulling="2026-04-21 15:37:19.205544991 +0000 UTC m=+114.777577925" lastFinishedPulling="2026-04-21 15:37:21.860075778 +0000 UTC m=+117.432108698" observedRunningTime="2026-04-21 15:37:22.467095482 +0000 UTC m=+118.039128430" watchObservedRunningTime="2026-04-21 15:37:22.468250897 +0000 UTC m=+118.040283839" Apr 21 15:37:22.468315 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:22.468308 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3beca87f-1e7f-40a9-bd98-faa9ab860ad7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gj5kb\" (UID: \"3beca87f-1e7f-40a9-bd98-faa9ab860ad7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gj5kb" Apr 21 15:37:22.468580 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:22.468468 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 15:37:22.468580 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:22.468550 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3beca87f-1e7f-40a9-bd98-faa9ab860ad7-samples-operator-tls podName:3beca87f-1e7f-40a9-bd98-faa9ab860ad7 nodeName:}" failed. No retries permitted until 2026-04-21 15:37:26.468532262 +0000 UTC m=+122.040565190 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3beca87f-1e7f-40a9-bd98-faa9ab860ad7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gj5kb" (UID: "3beca87f-1e7f-40a9-bd98-faa9ab860ad7") : secret "samples-operator-tls" not found Apr 21 15:37:22.496950 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:22.496888 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qdkns" podStartSLOduration=1.825762721 podStartE2EDuration="4.496868698s" podCreationTimestamp="2026-04-21 15:37:18 +0000 UTC" firstStartedPulling="2026-04-21 15:37:19.18930753 +0000 UTC m=+114.761340464" lastFinishedPulling="2026-04-21 15:37:21.860413521 +0000 UTC m=+117.432446441" observedRunningTime="2026-04-21 15:37:22.495768999 +0000 UTC m=+118.067801945" watchObservedRunningTime="2026-04-21 15:37:22.496868698 +0000 UTC m=+118.068901644" Apr 21 15:37:22.520251 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:22.520191 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-mbdj9" podStartSLOduration=1.466330989 podStartE2EDuration="4.520171832s" podCreationTimestamp="2026-04-21 15:37:18 +0000 UTC" firstStartedPulling="2026-04-21 15:37:18.799009767 +0000 UTC m=+114.371042688" lastFinishedPulling="2026-04-21 15:37:21.8528506 +0000 UTC m=+117.424883531" observedRunningTime="2026-04-21 15:37:22.51801287 +0000 UTC m=+118.090045814" watchObservedRunningTime="2026-04-21 15:37:22.520171832 +0000 UTC m=+118.092204772" Apr 21 15:37:26.096028 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:26.095973 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e601bc85-e6ec-4e42-8728-90ec8be4699c-service-ca-bundle\") pod \"router-default-744b6c97cd-ktnzw\" (UID: 
\"e601bc85-e6ec-4e42-8728-90ec8be4699c\") " pod="openshift-ingress/router-default-744b6c97cd-ktnzw" Apr 21 15:37:26.096534 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:26.096076 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e601bc85-e6ec-4e42-8728-90ec8be4699c-metrics-certs\") pod \"router-default-744b6c97cd-ktnzw\" (UID: \"e601bc85-e6ec-4e42-8728-90ec8be4699c\") " pod="openshift-ingress/router-default-744b6c97cd-ktnzw" Apr 21 15:37:26.096534 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:26.096142 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e601bc85-e6ec-4e42-8728-90ec8be4699c-service-ca-bundle podName:e601bc85-e6ec-4e42-8728-90ec8be4699c nodeName:}" failed. No retries permitted until 2026-04-21 15:37:34.096125016 +0000 UTC m=+129.668157937 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e601bc85-e6ec-4e42-8728-90ec8be4699c-service-ca-bundle") pod "router-default-744b6c97cd-ktnzw" (UID: "e601bc85-e6ec-4e42-8728-90ec8be4699c") : configmap references non-existent config key: service-ca.crt Apr 21 15:37:26.096534 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:26.096174 2569 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 15:37:26.096534 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:26.096236 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e601bc85-e6ec-4e42-8728-90ec8be4699c-metrics-certs podName:e601bc85-e6ec-4e42-8728-90ec8be4699c nodeName:}" failed. No retries permitted until 2026-04-21 15:37:34.096221903 +0000 UTC m=+129.668254829 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e601bc85-e6ec-4e42-8728-90ec8be4699c-metrics-certs") pod "router-default-744b6c97cd-ktnzw" (UID: "e601bc85-e6ec-4e42-8728-90ec8be4699c") : secret "router-metrics-certs-default" not found Apr 21 15:37:26.213736 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:26.213708 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9b9sq_50b93a7e-ace3-4153-b2d3-ea527a654b34/dns-node-resolver/0.log" Apr 21 15:37:26.498359 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:26.498321 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3beca87f-1e7f-40a9-bd98-faa9ab860ad7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gj5kb\" (UID: \"3beca87f-1e7f-40a9-bd98-faa9ab860ad7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gj5kb" Apr 21 15:37:26.498548 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:26.498496 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 15:37:26.498591 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:26.498557 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3beca87f-1e7f-40a9-bd98-faa9ab860ad7-samples-operator-tls podName:3beca87f-1e7f-40a9-bd98-faa9ab860ad7 nodeName:}" failed. No retries permitted until 2026-04-21 15:37:34.498540087 +0000 UTC m=+130.070573011 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3beca87f-1e7f-40a9-bd98-faa9ab860ad7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gj5kb" (UID: "3beca87f-1e7f-40a9-bd98-faa9ab860ad7") : secret "samples-operator-tls" not found Apr 21 15:37:27.213680 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:27.213649 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jddzn_de4d7367-97b7-475a-b70f-d1b1f47d5106/node-ca/0.log" Apr 21 15:37:28.816564 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:28.816528 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-qdkns_f4e9ee47-7719-418f-90bf-ada2af6eab08/kube-storage-version-migrator-operator/0.log" Apr 21 15:37:34.161173 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:34.161134 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e601bc85-e6ec-4e42-8728-90ec8be4699c-service-ca-bundle\") pod \"router-default-744b6c97cd-ktnzw\" (UID: \"e601bc85-e6ec-4e42-8728-90ec8be4699c\") " pod="openshift-ingress/router-default-744b6c97cd-ktnzw" Apr 21 15:37:34.161601 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:34.161214 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e601bc85-e6ec-4e42-8728-90ec8be4699c-metrics-certs\") pod \"router-default-744b6c97cd-ktnzw\" (UID: \"e601bc85-e6ec-4e42-8728-90ec8be4699c\") " pod="openshift-ingress/router-default-744b6c97cd-ktnzw" Apr 21 15:37:34.161601 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:34.161311 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e601bc85-e6ec-4e42-8728-90ec8be4699c-service-ca-bundle podName:e601bc85-e6ec-4e42-8728-90ec8be4699c nodeName:}" 
failed. No retries permitted until 2026-04-21 15:37:50.161293797 +0000 UTC m=+145.733326717 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e601bc85-e6ec-4e42-8728-90ec8be4699c-service-ca-bundle") pod "router-default-744b6c97cd-ktnzw" (UID: "e601bc85-e6ec-4e42-8728-90ec8be4699c") : configmap references non-existent config key: service-ca.crt Apr 21 15:37:34.163524 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:34.163507 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e601bc85-e6ec-4e42-8728-90ec8be4699c-metrics-certs\") pod \"router-default-744b6c97cd-ktnzw\" (UID: \"e601bc85-e6ec-4e42-8728-90ec8be4699c\") " pod="openshift-ingress/router-default-744b6c97cd-ktnzw" Apr 21 15:37:34.564321 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:34.564281 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3beca87f-1e7f-40a9-bd98-faa9ab860ad7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gj5kb\" (UID: \"3beca87f-1e7f-40a9-bd98-faa9ab860ad7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gj5kb" Apr 21 15:37:34.566642 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:34.566623 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3beca87f-1e7f-40a9-bd98-faa9ab860ad7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gj5kb\" (UID: \"3beca87f-1e7f-40a9-bd98-faa9ab860ad7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gj5kb" Apr 21 15:37:34.656566 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:34.656531 2569 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-9ztdz\"" Apr 21 15:37:34.662006 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:34.661983 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gj5kb" Apr 21 15:37:34.765974 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:34.765940 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/893ee07d-ac5e-4593-93fd-80655b690072-metrics-certs\") pod \"network-metrics-daemon-lp64c\" (UID: \"893ee07d-ac5e-4593-93fd-80655b690072\") " pod="openshift-multus/network-metrics-daemon-lp64c" Apr 21 15:37:34.766115 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:34.766049 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 15:37:34.766160 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:34.766117 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/893ee07d-ac5e-4593-93fd-80655b690072-metrics-certs podName:893ee07d-ac5e-4593-93fd-80655b690072 nodeName:}" failed. No retries permitted until 2026-04-21 15:39:36.766100322 +0000 UTC m=+252.338133248 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/893ee07d-ac5e-4593-93fd-80655b690072-metrics-certs") pod "network-metrics-daemon-lp64c" (UID: "893ee07d-ac5e-4593-93fd-80655b690072") : secret "metrics-daemon-secret" not found Apr 21 15:37:34.793427 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:34.791036 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gj5kb"] Apr 21 15:37:35.482891 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:35.482850 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gj5kb" event={"ID":"3beca87f-1e7f-40a9-bd98-faa9ab860ad7","Type":"ContainerStarted","Data":"9ba5737cef51e82977ba63ad819e1cb1fbc5a46484b8c3b06ef51bfb143d4303"} Apr 21 15:37:37.489739 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:37.489702 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gj5kb" event={"ID":"3beca87f-1e7f-40a9-bd98-faa9ab860ad7","Type":"ContainerStarted","Data":"43a6bcbf7a37ee2aec4df9cc1c71210192f318ce2108583a932d99462854c65b"} Apr 21 15:37:37.489739 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:37.489739 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gj5kb" event={"ID":"3beca87f-1e7f-40a9-bd98-faa9ab860ad7","Type":"ContainerStarted","Data":"220efe67fa24dc20f61809585f0ad6b594f3ffe06c3a4946b4477daa5794aaa3"} Apr 21 15:37:37.508554 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:37.508499 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gj5kb" podStartSLOduration=17.727380586 podStartE2EDuration="19.508463136s" podCreationTimestamp="2026-04-21 15:37:18 +0000 UTC" firstStartedPulling="2026-04-21 
15:37:34.839270402 +0000 UTC m=+130.411303326" lastFinishedPulling="2026-04-21 15:37:36.620352952 +0000 UTC m=+132.192385876" observedRunningTime="2026-04-21 15:37:37.507714768 +0000 UTC m=+133.079747712" watchObservedRunningTime="2026-04-21 15:37:37.508463136 +0000 UTC m=+133.080496079" Apr 21 15:37:45.514548 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.514510 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-qvkhd"] Apr 21 15:37:45.518314 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.518289 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qvkhd" Apr 21 15:37:45.521782 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.521758 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 21 15:37:45.522800 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.522778 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-5dx25\"" Apr 21 15:37:45.522888 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.522783 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 21 15:37:45.529612 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.529588 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-hpj5v"] Apr 21 15:37:45.533321 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.533299 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-qvkhd"] Apr 21 15:37:45.533420 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.533411 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hpj5v" Apr 21 15:37:45.538665 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.538641 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 15:37:45.538798 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.538701 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 15:37:45.538920 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.538902 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-shndr\"" Apr 21 15:37:45.557674 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.557642 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hpj5v"] Apr 21 15:37:45.651010 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.650973 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/999507b2-ef4d-48d7-acfe-a00e4f249573-crio-socket\") pod \"insights-runtime-extractor-hpj5v\" (UID: \"999507b2-ef4d-48d7-acfe-a00e4f249573\") " pod="openshift-insights/insights-runtime-extractor-hpj5v" Apr 21 15:37:45.651176 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.651014 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/999507b2-ef4d-48d7-acfe-a00e4f249573-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hpj5v\" (UID: \"999507b2-ef4d-48d7-acfe-a00e4f249573\") " pod="openshift-insights/insights-runtime-extractor-hpj5v" Apr 21 15:37:45.651176 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.651041 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/999507b2-ef4d-48d7-acfe-a00e4f249573-data-volume\") pod \"insights-runtime-extractor-hpj5v\" (UID: \"999507b2-ef4d-48d7-acfe-a00e4f249573\") " pod="openshift-insights/insights-runtime-extractor-hpj5v" Apr 21 15:37:45.651304 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.651181 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/999507b2-ef4d-48d7-acfe-a00e4f249573-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hpj5v\" (UID: \"999507b2-ef4d-48d7-acfe-a00e4f249573\") " pod="openshift-insights/insights-runtime-extractor-hpj5v" Apr 21 15:37:45.651304 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.651210 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/54dec317-f338-45be-84df-5116ae87636c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qvkhd\" (UID: \"54dec317-f338-45be-84df-5116ae87636c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qvkhd" Apr 21 15:37:45.651304 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.651243 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kwjk\" (UniqueName: \"kubernetes.io/projected/999507b2-ef4d-48d7-acfe-a00e4f249573-kube-api-access-6kwjk\") pod \"insights-runtime-extractor-hpj5v\" (UID: \"999507b2-ef4d-48d7-acfe-a00e4f249573\") " pod="openshift-insights/insights-runtime-extractor-hpj5v" Apr 21 15:37:45.651399 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.651340 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/54dec317-f338-45be-84df-5116ae87636c-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-qvkhd\" (UID: \"54dec317-f338-45be-84df-5116ae87636c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qvkhd" Apr 21 15:37:45.654299 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.654277 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-74dfb5f878-97xdw"] Apr 21 15:37:45.657349 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.657335 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-74dfb5f878-97xdw" Apr 21 15:37:45.660554 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.660524 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 15:37:45.660554 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.660529 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-kwwxq\"" Apr 21 15:37:45.660878 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.660859 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 15:37:45.660952 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.660859 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 15:37:45.667132 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.667110 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 15:37:45.682863 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.682837 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-74dfb5f878-97xdw"] Apr 21 15:37:45.752445 
ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.752409 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/999507b2-ef4d-48d7-acfe-a00e4f249573-crio-socket\") pod \"insights-runtime-extractor-hpj5v\" (UID: \"999507b2-ef4d-48d7-acfe-a00e4f249573\") " pod="openshift-insights/insights-runtime-extractor-hpj5v" Apr 21 15:37:45.752445 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.752441 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/999507b2-ef4d-48d7-acfe-a00e4f249573-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hpj5v\" (UID: \"999507b2-ef4d-48d7-acfe-a00e4f249573\") " pod="openshift-insights/insights-runtime-extractor-hpj5v" Apr 21 15:37:45.752665 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.752463 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/23290611-2011-4b35-851e-ccedc88a9391-image-registry-private-configuration\") pod \"image-registry-74dfb5f878-97xdw\" (UID: \"23290611-2011-4b35-851e-ccedc88a9391\") " pod="openshift-image-registry/image-registry-74dfb5f878-97xdw" Apr 21 15:37:45.752665 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.752505 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/999507b2-ef4d-48d7-acfe-a00e4f249573-data-volume\") pod \"insights-runtime-extractor-hpj5v\" (UID: \"999507b2-ef4d-48d7-acfe-a00e4f249573\") " pod="openshift-insights/insights-runtime-extractor-hpj5v" Apr 21 15:37:45.752665 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.752543 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/999507b2-ef4d-48d7-acfe-a00e4f249573-crio-socket\") pod \"insights-runtime-extractor-hpj5v\" (UID: \"999507b2-ef4d-48d7-acfe-a00e4f249573\") " pod="openshift-insights/insights-runtime-extractor-hpj5v" Apr 21 15:37:45.752665 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.752560 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/23290611-2011-4b35-851e-ccedc88a9391-ca-trust-extracted\") pod \"image-registry-74dfb5f878-97xdw\" (UID: \"23290611-2011-4b35-851e-ccedc88a9391\") " pod="openshift-image-registry/image-registry-74dfb5f878-97xdw" Apr 21 15:37:45.752665 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.752598 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np5qx\" (UniqueName: \"kubernetes.io/projected/23290611-2011-4b35-851e-ccedc88a9391-kube-api-access-np5qx\") pod \"image-registry-74dfb5f878-97xdw\" (UID: \"23290611-2011-4b35-851e-ccedc88a9391\") " pod="openshift-image-registry/image-registry-74dfb5f878-97xdw" Apr 21 15:37:45.752665 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.752645 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/23290611-2011-4b35-851e-ccedc88a9391-registry-tls\") pod \"image-registry-74dfb5f878-97xdw\" (UID: \"23290611-2011-4b35-851e-ccedc88a9391\") " pod="openshift-image-registry/image-registry-74dfb5f878-97xdw" Apr 21 15:37:45.752853 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.752713 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/999507b2-ef4d-48d7-acfe-a00e4f249573-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hpj5v\" (UID: \"999507b2-ef4d-48d7-acfe-a00e4f249573\") " 
pod="openshift-insights/insights-runtime-extractor-hpj5v" Apr 21 15:37:45.752853 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.752781 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/54dec317-f338-45be-84df-5116ae87636c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qvkhd\" (UID: \"54dec317-f338-45be-84df-5116ae87636c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qvkhd" Apr 21 15:37:45.752853 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.752808 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/999507b2-ef4d-48d7-acfe-a00e4f249573-data-volume\") pod \"insights-runtime-extractor-hpj5v\" (UID: \"999507b2-ef4d-48d7-acfe-a00e4f249573\") " pod="openshift-insights/insights-runtime-extractor-hpj5v" Apr 21 15:37:45.752853 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.752818 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6kwjk\" (UniqueName: \"kubernetes.io/projected/999507b2-ef4d-48d7-acfe-a00e4f249573-kube-api-access-6kwjk\") pod \"insights-runtime-extractor-hpj5v\" (UID: \"999507b2-ef4d-48d7-acfe-a00e4f249573\") " pod="openshift-insights/insights-runtime-extractor-hpj5v" Apr 21 15:37:45.753026 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.752897 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/23290611-2011-4b35-851e-ccedc88a9391-installation-pull-secrets\") pod \"image-registry-74dfb5f878-97xdw\" (UID: \"23290611-2011-4b35-851e-ccedc88a9391\") " pod="openshift-image-registry/image-registry-74dfb5f878-97xdw" Apr 21 15:37:45.753026 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.752938 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/23290611-2011-4b35-851e-ccedc88a9391-registry-certificates\") pod \"image-registry-74dfb5f878-97xdw\" (UID: \"23290611-2011-4b35-851e-ccedc88a9391\") " pod="openshift-image-registry/image-registry-74dfb5f878-97xdw" Apr 21 15:37:45.753026 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.752966 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23290611-2011-4b35-851e-ccedc88a9391-trusted-ca\") pod \"image-registry-74dfb5f878-97xdw\" (UID: \"23290611-2011-4b35-851e-ccedc88a9391\") " pod="openshift-image-registry/image-registry-74dfb5f878-97xdw" Apr 21 15:37:45.753026 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.753020 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/54dec317-f338-45be-84df-5116ae87636c-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-qvkhd\" (UID: \"54dec317-f338-45be-84df-5116ae87636c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qvkhd" Apr 21 15:37:45.753206 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.753062 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/23290611-2011-4b35-851e-ccedc88a9391-bound-sa-token\") pod \"image-registry-74dfb5f878-97xdw\" (UID: \"23290611-2011-4b35-851e-ccedc88a9391\") " pod="openshift-image-registry/image-registry-74dfb5f878-97xdw" Apr 21 15:37:45.753206 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.753151 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/999507b2-ef4d-48d7-acfe-a00e4f249573-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hpj5v\" 
(UID: \"999507b2-ef4d-48d7-acfe-a00e4f249573\") " pod="openshift-insights/insights-runtime-extractor-hpj5v" Apr 21 15:37:45.753607 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.753588 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/54dec317-f338-45be-84df-5116ae87636c-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-qvkhd\" (UID: \"54dec317-f338-45be-84df-5116ae87636c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qvkhd" Apr 21 15:37:45.755040 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.755019 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/54dec317-f338-45be-84df-5116ae87636c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qvkhd\" (UID: \"54dec317-f338-45be-84df-5116ae87636c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qvkhd" Apr 21 15:37:45.755121 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.755022 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/999507b2-ef4d-48d7-acfe-a00e4f249573-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hpj5v\" (UID: \"999507b2-ef4d-48d7-acfe-a00e4f249573\") " pod="openshift-insights/insights-runtime-extractor-hpj5v" Apr 21 15:37:45.764721 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.764671 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kwjk\" (UniqueName: \"kubernetes.io/projected/999507b2-ef4d-48d7-acfe-a00e4f249573-kube-api-access-6kwjk\") pod \"insights-runtime-extractor-hpj5v\" (UID: \"999507b2-ef4d-48d7-acfe-a00e4f249573\") " pod="openshift-insights/insights-runtime-extractor-hpj5v" Apr 21 15:37:45.826521 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.826461 2569 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qvkhd" Apr 21 15:37:45.842537 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.842508 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hpj5v" Apr 21 15:37:45.854383 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.854357 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/23290611-2011-4b35-851e-ccedc88a9391-installation-pull-secrets\") pod \"image-registry-74dfb5f878-97xdw\" (UID: \"23290611-2011-4b35-851e-ccedc88a9391\") " pod="openshift-image-registry/image-registry-74dfb5f878-97xdw" Apr 21 15:37:45.854503 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.854397 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/23290611-2011-4b35-851e-ccedc88a9391-registry-certificates\") pod \"image-registry-74dfb5f878-97xdw\" (UID: \"23290611-2011-4b35-851e-ccedc88a9391\") " pod="openshift-image-registry/image-registry-74dfb5f878-97xdw" Apr 21 15:37:45.854571 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.854506 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23290611-2011-4b35-851e-ccedc88a9391-trusted-ca\") pod \"image-registry-74dfb5f878-97xdw\" (UID: \"23290611-2011-4b35-851e-ccedc88a9391\") " pod="openshift-image-registry/image-registry-74dfb5f878-97xdw" Apr 21 15:37:45.854571 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.854540 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/23290611-2011-4b35-851e-ccedc88a9391-bound-sa-token\") pod \"image-registry-74dfb5f878-97xdw\" (UID: 
\"23290611-2011-4b35-851e-ccedc88a9391\") " pod="openshift-image-registry/image-registry-74dfb5f878-97xdw" Apr 21 15:37:45.854571 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.854567 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/23290611-2011-4b35-851e-ccedc88a9391-image-registry-private-configuration\") pod \"image-registry-74dfb5f878-97xdw\" (UID: \"23290611-2011-4b35-851e-ccedc88a9391\") " pod="openshift-image-registry/image-registry-74dfb5f878-97xdw" Apr 21 15:37:45.854716 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.854595 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/23290611-2011-4b35-851e-ccedc88a9391-ca-trust-extracted\") pod \"image-registry-74dfb5f878-97xdw\" (UID: \"23290611-2011-4b35-851e-ccedc88a9391\") " pod="openshift-image-registry/image-registry-74dfb5f878-97xdw" Apr 21 15:37:45.854716 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.854619 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-np5qx\" (UniqueName: \"kubernetes.io/projected/23290611-2011-4b35-851e-ccedc88a9391-kube-api-access-np5qx\") pod \"image-registry-74dfb5f878-97xdw\" (UID: \"23290611-2011-4b35-851e-ccedc88a9391\") " pod="openshift-image-registry/image-registry-74dfb5f878-97xdw" Apr 21 15:37:45.854716 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.854676 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/23290611-2011-4b35-851e-ccedc88a9391-registry-tls\") pod \"image-registry-74dfb5f878-97xdw\" (UID: \"23290611-2011-4b35-851e-ccedc88a9391\") " pod="openshift-image-registry/image-registry-74dfb5f878-97xdw" Apr 21 15:37:45.855229 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.855164 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/23290611-2011-4b35-851e-ccedc88a9391-ca-trust-extracted\") pod \"image-registry-74dfb5f878-97xdw\" (UID: \"23290611-2011-4b35-851e-ccedc88a9391\") " pod="openshift-image-registry/image-registry-74dfb5f878-97xdw" Apr 21 15:37:45.855391 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.855366 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/23290611-2011-4b35-851e-ccedc88a9391-registry-certificates\") pod \"image-registry-74dfb5f878-97xdw\" (UID: \"23290611-2011-4b35-851e-ccedc88a9391\") " pod="openshift-image-registry/image-registry-74dfb5f878-97xdw" Apr 21 15:37:45.855784 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.855749 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23290611-2011-4b35-851e-ccedc88a9391-trusted-ca\") pod \"image-registry-74dfb5f878-97xdw\" (UID: \"23290611-2011-4b35-851e-ccedc88a9391\") " pod="openshift-image-registry/image-registry-74dfb5f878-97xdw" Apr 21 15:37:45.857852 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.857831 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/23290611-2011-4b35-851e-ccedc88a9391-installation-pull-secrets\") pod \"image-registry-74dfb5f878-97xdw\" (UID: \"23290611-2011-4b35-851e-ccedc88a9391\") " pod="openshift-image-registry/image-registry-74dfb5f878-97xdw" Apr 21 15:37:45.857912 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.857863 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/23290611-2011-4b35-851e-ccedc88a9391-registry-tls\") pod \"image-registry-74dfb5f878-97xdw\" (UID: \"23290611-2011-4b35-851e-ccedc88a9391\") " 
pod="openshift-image-registry/image-registry-74dfb5f878-97xdw" Apr 21 15:37:45.858148 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.858121 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/23290611-2011-4b35-851e-ccedc88a9391-image-registry-private-configuration\") pod \"image-registry-74dfb5f878-97xdw\" (UID: \"23290611-2011-4b35-851e-ccedc88a9391\") " pod="openshift-image-registry/image-registry-74dfb5f878-97xdw" Apr 21 15:37:45.873088 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.873033 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/23290611-2011-4b35-851e-ccedc88a9391-bound-sa-token\") pod \"image-registry-74dfb5f878-97xdw\" (UID: \"23290611-2011-4b35-851e-ccedc88a9391\") " pod="openshift-image-registry/image-registry-74dfb5f878-97xdw" Apr 21 15:37:45.873680 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.873640 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-np5qx\" (UniqueName: \"kubernetes.io/projected/23290611-2011-4b35-851e-ccedc88a9391-kube-api-access-np5qx\") pod \"image-registry-74dfb5f878-97xdw\" (UID: \"23290611-2011-4b35-851e-ccedc88a9391\") " pod="openshift-image-registry/image-registry-74dfb5f878-97xdw" Apr 21 15:37:45.962608 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.962578 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-qvkhd"] Apr 21 15:37:45.964925 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:37:45.964891 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54dec317_f338_45be_84df_5116ae87636c.slice/crio-e77adf7b72b5accef3007106d64727ccfcb653ecb65b1d2c99171a8237ebc4d8 WatchSource:0}: Error finding container 
e77adf7b72b5accef3007106d64727ccfcb653ecb65b1d2c99171a8237ebc4d8: Status 404 returned error can't find the container with id e77adf7b72b5accef3007106d64727ccfcb653ecb65b1d2c99171a8237ebc4d8 Apr 21 15:37:45.966657 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.966578 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-74dfb5f878-97xdw" Apr 21 15:37:45.985727 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:45.985699 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hpj5v"] Apr 21 15:37:45.989109 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:37:45.989079 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod999507b2_ef4d_48d7_acfe_a00e4f249573.slice/crio-9f92a26546a6b331efb638566e65204e005e8a691202a9ac106820df83c0aed3 WatchSource:0}: Error finding container 9f92a26546a6b331efb638566e65204e005e8a691202a9ac106820df83c0aed3: Status 404 returned error can't find the container with id 9f92a26546a6b331efb638566e65204e005e8a691202a9ac106820df83c0aed3 Apr 21 15:37:46.110564 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:46.110532 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-74dfb5f878-97xdw"] Apr 21 15:37:46.113582 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:37:46.113550 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23290611_2011_4b35_851e_ccedc88a9391.slice/crio-c5851c11fb20371f6b37ee55c762d9070156d2abb0ccbe61f2f533917c8040dc WatchSource:0}: Error finding container c5851c11fb20371f6b37ee55c762d9070156d2abb0ccbe61f2f533917c8040dc: Status 404 returned error can't find the container with id c5851c11fb20371f6b37ee55c762d9070156d2abb0ccbe61f2f533917c8040dc Apr 21 15:37:46.517142 ip-10-0-128-232 kubenswrapper[2569]: I0421 
15:37:46.517099 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-74dfb5f878-97xdw" event={"ID":"23290611-2011-4b35-851e-ccedc88a9391","Type":"ContainerStarted","Data":"bad067aabca48a4a908335da5ae59c917f51214553c0a5549d01c374a24ddbb1"} Apr 21 15:37:46.517142 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:46.517144 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-74dfb5f878-97xdw" event={"ID":"23290611-2011-4b35-851e-ccedc88a9391","Type":"ContainerStarted","Data":"c5851c11fb20371f6b37ee55c762d9070156d2abb0ccbe61f2f533917c8040dc"} Apr 21 15:37:46.517657 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:46.517202 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-74dfb5f878-97xdw" Apr 21 15:37:46.519187 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:46.519158 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hpj5v" event={"ID":"999507b2-ef4d-48d7-acfe-a00e4f249573","Type":"ContainerStarted","Data":"40d80a69f7b5accc90e818e9ba11fcee761fca0a4a04463778e9a54a34af7f7d"} Apr 21 15:37:46.519342 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:46.519193 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hpj5v" event={"ID":"999507b2-ef4d-48d7-acfe-a00e4f249573","Type":"ContainerStarted","Data":"9f92a26546a6b331efb638566e65204e005e8a691202a9ac106820df83c0aed3"} Apr 21 15:37:46.520332 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:46.520305 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qvkhd" event={"ID":"54dec317-f338-45be-84df-5116ae87636c","Type":"ContainerStarted","Data":"e77adf7b72b5accef3007106d64727ccfcb653ecb65b1d2c99171a8237ebc4d8"} Apr 21 15:37:46.656666 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:46.656602 
2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-74dfb5f878-97xdw" podStartSLOduration=1.656581938 podStartE2EDuration="1.656581938s" podCreationTimestamp="2026-04-21 15:37:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:37:46.655042761 +0000 UTC m=+142.227075707" watchObservedRunningTime="2026-04-21 15:37:46.656581938 +0000 UTC m=+142.228614882" Apr 21 15:37:47.524582 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:47.524543 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hpj5v" event={"ID":"999507b2-ef4d-48d7-acfe-a00e4f249573","Type":"ContainerStarted","Data":"f759e09c6669ec5486ab40e20366618417935b772e4bc4c221fe7841019bb5ce"} Apr 21 15:37:47.525877 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:47.525852 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qvkhd" event={"ID":"54dec317-f338-45be-84df-5116ae87636c","Type":"ContainerStarted","Data":"80face44499e16f3ba7a343cf097d57110153e11a7019f379081f4f234231327"} Apr 21 15:37:47.544907 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:47.544849 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qvkhd" podStartSLOduration=1.5390005960000002 podStartE2EDuration="2.544833555s" podCreationTimestamp="2026-04-21 15:37:45 +0000 UTC" firstStartedPulling="2026-04-21 15:37:45.967039535 +0000 UTC m=+141.539072456" lastFinishedPulling="2026-04-21 15:37:46.972872494 +0000 UTC m=+142.544905415" observedRunningTime="2026-04-21 15:37:47.543514608 +0000 UTC m=+143.115547555" watchObservedRunningTime="2026-04-21 15:37:47.544833555 +0000 UTC m=+143.116866499" Apr 21 15:37:48.530989 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:48.530947 2569 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hpj5v" event={"ID":"999507b2-ef4d-48d7-acfe-a00e4f249573","Type":"ContainerStarted","Data":"d938472781b1c51a1f69f6d455d9c778140edd31018b67c73bded462d3d87807"} Apr 21 15:37:48.553632 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:48.553579 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-hpj5v" podStartSLOduration=1.5203121 podStartE2EDuration="3.553563509s" podCreationTimestamp="2026-04-21 15:37:45 +0000 UTC" firstStartedPulling="2026-04-21 15:37:46.056885152 +0000 UTC m=+141.628918088" lastFinishedPulling="2026-04-21 15:37:48.090136572 +0000 UTC m=+143.662169497" observedRunningTime="2026-04-21 15:37:48.551825109 +0000 UTC m=+144.123858076" watchObservedRunningTime="2026-04-21 15:37:48.553563509 +0000 UTC m=+144.125596455" Apr 21 15:37:50.195938 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:50.195896 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e601bc85-e6ec-4e42-8728-90ec8be4699c-service-ca-bundle\") pod \"router-default-744b6c97cd-ktnzw\" (UID: \"e601bc85-e6ec-4e42-8728-90ec8be4699c\") " pod="openshift-ingress/router-default-744b6c97cd-ktnzw" Apr 21 15:37:50.196585 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:50.196565 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e601bc85-e6ec-4e42-8728-90ec8be4699c-service-ca-bundle\") pod \"router-default-744b6c97cd-ktnzw\" (UID: \"e601bc85-e6ec-4e42-8728-90ec8be4699c\") " pod="openshift-ingress/router-default-744b6c97cd-ktnzw" Apr 21 15:37:50.465699 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:50.465617 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-jgbz7\"" Apr 21 15:37:50.473093 
ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:50.473069 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-744b6c97cd-ktnzw" Apr 21 15:37:50.594115 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:50.594081 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-744b6c97cd-ktnzw"] Apr 21 15:37:50.597548 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:37:50.597515 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode601bc85_e6ec_4e42_8728_90ec8be4699c.slice/crio-19478e8054f2cd2f8acce7e0dfa568d20f708c1532021e9bd98ae9d63cb447a9 WatchSource:0}: Error finding container 19478e8054f2cd2f8acce7e0dfa568d20f708c1532021e9bd98ae9d63cb447a9: Status 404 returned error can't find the container with id 19478e8054f2cd2f8acce7e0dfa568d20f708c1532021e9bd98ae9d63cb447a9 Apr 21 15:37:51.539889 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:51.539856 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-744b6c97cd-ktnzw" event={"ID":"e601bc85-e6ec-4e42-8728-90ec8be4699c","Type":"ContainerStarted","Data":"b0434b91fa3094b9cab328b9313649250c3a69e3e443f24ac6f45fcb0449428d"} Apr 21 15:37:51.539889 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:51.539895 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-744b6c97cd-ktnzw" event={"ID":"e601bc85-e6ec-4e42-8728-90ec8be4699c","Type":"ContainerStarted","Data":"19478e8054f2cd2f8acce7e0dfa568d20f708c1532021e9bd98ae9d63cb447a9"} Apr 21 15:37:51.566248 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:51.566199 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-744b6c97cd-ktnzw" podStartSLOduration=33.566182881 podStartE2EDuration="33.566182881s" podCreationTimestamp="2026-04-21 15:37:18 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:37:51.565961649 +0000 UTC m=+147.137994596" watchObservedRunningTime="2026-04-21 15:37:51.566182881 +0000 UTC m=+147.138215820" Apr 21 15:37:52.413279 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:52.413245 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-522f8"] Apr 21 15:37:52.417415 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:52.417397 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-522f8" Apr 21 15:37:52.422251 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:52.422222 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 21 15:37:52.422378 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:52.422282 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-9dljf\"" Apr 21 15:37:52.473935 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:52.473896 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-744b6c97cd-ktnzw" Apr 21 15:37:52.476516 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:52.476492 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-744b6c97cd-ktnzw" Apr 21 15:37:52.503976 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:52.503942 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-522f8"] Apr 21 15:37:52.512980 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:52.512956 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tls-certificates\" (UniqueName: \"kubernetes.io/secret/1e991f09-4d79-4d2a-9195-adc43e8fcfc4-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-522f8\" (UID: \"1e991f09-4d79-4d2a-9195-adc43e8fcfc4\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-522f8" Apr 21 15:37:52.543175 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:52.543151 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-744b6c97cd-ktnzw" Apr 21 15:37:52.544443 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:52.544419 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-744b6c97cd-ktnzw" Apr 21 15:37:52.614155 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:52.614122 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/1e991f09-4d79-4d2a-9195-adc43e8fcfc4-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-522f8\" (UID: \"1e991f09-4d79-4d2a-9195-adc43e8fcfc4\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-522f8" Apr 21 15:37:52.614390 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:52.614363 2569 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 21 15:37:52.614653 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:52.614634 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e991f09-4d79-4d2a-9195-adc43e8fcfc4-tls-certificates podName:1e991f09-4d79-4d2a-9195-adc43e8fcfc4 nodeName:}" failed. No retries permitted until 2026-04-21 15:37:53.114609177 +0000 UTC m=+148.686642118 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/1e991f09-4d79-4d2a-9195-adc43e8fcfc4-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-522f8" (UID: "1e991f09-4d79-4d2a-9195-adc43e8fcfc4") : secret "prometheus-operator-admission-webhook-tls" not found Apr 21 15:37:53.117013 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:53.116919 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/1e991f09-4d79-4d2a-9195-adc43e8fcfc4-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-522f8\" (UID: \"1e991f09-4d79-4d2a-9195-adc43e8fcfc4\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-522f8" Apr 21 15:37:53.119347 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:53.119314 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/1e991f09-4d79-4d2a-9195-adc43e8fcfc4-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-522f8\" (UID: \"1e991f09-4d79-4d2a-9195-adc43e8fcfc4\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-522f8" Apr 21 15:37:53.326591 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:53.326549 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-522f8"
Apr 21 15:37:53.454864 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:53.454831 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-522f8"]
Apr 21 15:37:53.458773 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:37:53.458747 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e991f09_4d79_4d2a_9195_adc43e8fcfc4.slice/crio-63144f3332534d42e1df7a10fcddb074350456ead79004a6ac2868b46478c5a7 WatchSource:0}: Error finding container 63144f3332534d42e1df7a10fcddb074350456ead79004a6ac2868b46478c5a7: Status 404 returned error can't find the container with id 63144f3332534d42e1df7a10fcddb074350456ead79004a6ac2868b46478c5a7
Apr 21 15:37:53.546985 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:53.546949 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-522f8" event={"ID":"1e991f09-4d79-4d2a-9195-adc43e8fcfc4","Type":"ContainerStarted","Data":"63144f3332534d42e1df7a10fcddb074350456ead79004a6ac2868b46478c5a7"}
Apr 21 15:37:54.551019 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:54.550977 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-522f8" event={"ID":"1e991f09-4d79-4d2a-9195-adc43e8fcfc4","Type":"ContainerStarted","Data":"0ce87dc5e2c7860b8f04b07e43a4bab01be6deb27eb8f67ddbc4a3246927381a"}
Apr 21 15:37:54.551424 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:54.551240 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-522f8"
Apr 21 15:37:54.556186 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:54.556159 2569 kubelet.go:2658] "SyncLoop (probe)"
probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-522f8"
Apr 21 15:37:54.573985 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:54.573937 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-522f8" podStartSLOduration=1.645866587 podStartE2EDuration="2.573922898s" podCreationTimestamp="2026-04-21 15:37:52 +0000 UTC" firstStartedPulling="2026-04-21 15:37:53.460660523 +0000 UTC m=+149.032693444" lastFinishedPulling="2026-04-21 15:37:54.38871683 +0000 UTC m=+149.960749755" observedRunningTime="2026-04-21 15:37:54.572210784 +0000 UTC m=+150.144243726" watchObservedRunningTime="2026-04-21 15:37:54.573922898 +0000 UTC m=+150.145955841"
Apr 21 15:37:55.515671 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:55.515636 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-rmh8w"]
Apr 21 15:37:55.518829 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:55.518811 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-rmh8w"
Apr 21 15:37:55.522213 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:55.522192 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 21 15:37:55.523279 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:55.523263 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 21 15:37:55.523554 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:55.523532 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 21 15:37:55.523656 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:55.523567 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-j5nlm\""
Apr 21 15:37:55.524103 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:55.524081 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 21 15:37:55.525767 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:55.525593 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 21 15:37:55.534143 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:55.534116 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/110cb247-6c78-4666-aba6-4a6ac658c728-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-rmh8w\" (UID: \"110cb247-6c78-4666-aba6-4a6ac658c728\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rmh8w"
Apr 21 15:37:55.534250 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:55.534152 2569
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/110cb247-6c78-4666-aba6-4a6ac658c728-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-rmh8w\" (UID: \"110cb247-6c78-4666-aba6-4a6ac658c728\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rmh8w"
Apr 21 15:37:55.534250 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:55.534177 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr2wm\" (UniqueName: \"kubernetes.io/projected/110cb247-6c78-4666-aba6-4a6ac658c728-kube-api-access-tr2wm\") pod \"prometheus-operator-5676c8c784-rmh8w\" (UID: \"110cb247-6c78-4666-aba6-4a6ac658c728\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rmh8w"
Apr 21 15:37:55.534250 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:55.534216 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/110cb247-6c78-4666-aba6-4a6ac658c728-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-rmh8w\" (UID: \"110cb247-6c78-4666-aba6-4a6ac658c728\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rmh8w"
Apr 21 15:37:55.543812 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:55.543785 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-rmh8w"]
Apr 21 15:37:55.634994 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:55.634947 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/110cb247-6c78-4666-aba6-4a6ac658c728-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-rmh8w\" (UID: \"110cb247-6c78-4666-aba6-4a6ac658c728\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rmh8w"
Apr 21 15:37:55.635425 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:55.635055 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/110cb247-6c78-4666-aba6-4a6ac658c728-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-rmh8w\" (UID: \"110cb247-6c78-4666-aba6-4a6ac658c728\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rmh8w"
Apr 21 15:37:55.635425 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:55.635077 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tr2wm\" (UniqueName: \"kubernetes.io/projected/110cb247-6c78-4666-aba6-4a6ac658c728-kube-api-access-tr2wm\") pod \"prometheus-operator-5676c8c784-rmh8w\" (UID: \"110cb247-6c78-4666-aba6-4a6ac658c728\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rmh8w"
Apr 21 15:37:55.635425 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:55.635104 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/110cb247-6c78-4666-aba6-4a6ac658c728-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-rmh8w\" (UID: \"110cb247-6c78-4666-aba6-4a6ac658c728\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rmh8w"
Apr 21 15:37:55.635425 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:55.635192 2569 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 21 15:37:55.635425 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:37:55.635242 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/110cb247-6c78-4666-aba6-4a6ac658c728-prometheus-operator-tls podName:110cb247-6c78-4666-aba6-4a6ac658c728 nodeName:}" failed. No retries permitted until 2026-04-21 15:37:56.13522615 +0000 UTC m=+151.707259075 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/110cb247-6c78-4666-aba6-4a6ac658c728-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-rmh8w" (UID: "110cb247-6c78-4666-aba6-4a6ac658c728") : secret "prometheus-operator-tls" not found
Apr 21 15:37:55.635871 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:55.635847 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/110cb247-6c78-4666-aba6-4a6ac658c728-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-rmh8w\" (UID: \"110cb247-6c78-4666-aba6-4a6ac658c728\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rmh8w"
Apr 21 15:37:55.637384 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:55.637366 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/110cb247-6c78-4666-aba6-4a6ac658c728-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-rmh8w\" (UID: \"110cb247-6c78-4666-aba6-4a6ac658c728\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rmh8w"
Apr 21 15:37:55.648406 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:55.648378 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr2wm\" (UniqueName: \"kubernetes.io/projected/110cb247-6c78-4666-aba6-4a6ac658c728-kube-api-access-tr2wm\") pod \"prometheus-operator-5676c8c784-rmh8w\" (UID: \"110cb247-6c78-4666-aba6-4a6ac658c728\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rmh8w"
Apr 21 15:37:56.138707 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:56.138663 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/110cb247-6c78-4666-aba6-4a6ac658c728-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-rmh8w\" (UID:
\"110cb247-6c78-4666-aba6-4a6ac658c728\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rmh8w"
Apr 21 15:37:56.141094 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:56.141056 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/110cb247-6c78-4666-aba6-4a6ac658c728-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-rmh8w\" (UID: \"110cb247-6c78-4666-aba6-4a6ac658c728\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rmh8w"
Apr 21 15:37:56.428148 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:56.428117 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-rmh8w"
Apr 21 15:37:56.552261 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:56.552219 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-rmh8w"]
Apr 21 15:37:56.566981 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:37:56.566952 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod110cb247_6c78_4666_aba6_4a6ac658c728.slice/crio-795422f3f95b8c2b492ccf564bed1b4a4b7eaf538a5d2ac01bc662337e57557b WatchSource:0}: Error finding container 795422f3f95b8c2b492ccf564bed1b4a4b7eaf538a5d2ac01bc662337e57557b: Status 404 returned error can't find the container with id 795422f3f95b8c2b492ccf564bed1b4a4b7eaf538a5d2ac01bc662337e57557b
Apr 21 15:37:57.560591 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:57.560542 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-rmh8w" event={"ID":"110cb247-6c78-4666-aba6-4a6ac658c728","Type":"ContainerStarted","Data":"795422f3f95b8c2b492ccf564bed1b4a4b7eaf538a5d2ac01bc662337e57557b"}
Apr 21 15:37:58.564646 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:58.564612 2569 kubelet.go:2569] "SyncLoop
(PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-rmh8w" event={"ID":"110cb247-6c78-4666-aba6-4a6ac658c728","Type":"ContainerStarted","Data":"3baf7bce292b5d26059e2ccf191a0c2a6fe1f71c3fc076bd20d75a9e818fe891"}
Apr 21 15:37:58.565014 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:58.564654 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-rmh8w" event={"ID":"110cb247-6c78-4666-aba6-4a6ac658c728","Type":"ContainerStarted","Data":"db5ffc655ca13938403e8ddb71158362b93ced54a7e00cf4ca6c210cd382e56e"}
Apr 21 15:37:58.593359 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:37:58.593306 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-rmh8w" podStartSLOduration=2.296408787 podStartE2EDuration="3.593291232s" podCreationTimestamp="2026-04-21 15:37:55 +0000 UTC" firstStartedPulling="2026-04-21 15:37:56.568652911 +0000 UTC m=+152.140685836" lastFinishedPulling="2026-04-21 15:37:57.86553536 +0000 UTC m=+153.437568281" observedRunningTime="2026-04-21 15:37:58.590523383 +0000 UTC m=+154.162556325" watchObservedRunningTime="2026-04-21 15:37:58.593291232 +0000 UTC m=+154.165324175"
Apr 21 15:38:00.940110 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:00.940075 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-s9g9g"]
Apr 21 15:38:00.943547 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:00.943530 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s9g9g"
Apr 21 15:38:00.946114 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:00.946090 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-j6gf8\""
Apr 21 15:38:00.946389 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:00.946202 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 21 15:38:00.946389 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:00.946242 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 21 15:38:00.959310 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:00.959283 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-s9g9g"]
Apr 21 15:38:00.982845 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:00.982807 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f42786e0-4a65-43c9-a735-126e01bd577a-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-s9g9g\" (UID: \"f42786e0-4a65-43c9-a735-126e01bd577a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s9g9g"
Apr 21 15:38:00.983004 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:00.982872 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f42786e0-4a65-43c9-a735-126e01bd577a-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-s9g9g\" (UID: \"f42786e0-4a65-43c9-a735-126e01bd577a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s9g9g"
Apr 21 15:38:00.983004 ip-10-0-128-232 kubenswrapper[2569]: I0421
15:38:00.982923 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7s2s\" (UniqueName: \"kubernetes.io/projected/f42786e0-4a65-43c9-a735-126e01bd577a-kube-api-access-q7s2s\") pod \"openshift-state-metrics-9d44df66c-s9g9g\" (UID: \"f42786e0-4a65-43c9-a735-126e01bd577a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s9g9g"
Apr 21 15:38:00.983004 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:00.982961 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f42786e0-4a65-43c9-a735-126e01bd577a-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-s9g9g\" (UID: \"f42786e0-4a65-43c9-a735-126e01bd577a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s9g9g"
Apr 21 15:38:00.984266 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:00.984245 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-s64zs"]
Apr 21 15:38:00.987383 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:00.987368 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/node-exporter-s64zs"
Apr 21 15:38:00.993049 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:00.993024 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 21 15:38:00.993262 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:00.993101 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 21 15:38:00.993262 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:00.993133 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 21 15:38:00.993880 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:00.993855 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-52fql\""
Apr 21 15:38:01.084254 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.084213 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f42786e0-4a65-43c9-a735-126e01bd577a-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-s9g9g\" (UID: \"f42786e0-4a65-43c9-a735-126e01bd577a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s9g9g"
Apr 21 15:38:01.084450 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.084263 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c-metrics-client-ca\") pod \"node-exporter-s64zs\" (UID: \"cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c\") " pod="openshift-monitoring/node-exporter-s64zs"
Apr 21 15:38:01.084450 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.084293 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume
\"kube-api-access-q7s2s\" (UniqueName: \"kubernetes.io/projected/f42786e0-4a65-43c9-a735-126e01bd577a-kube-api-access-q7s2s\") pod \"openshift-state-metrics-9d44df66c-s9g9g\" (UID: \"f42786e0-4a65-43c9-a735-126e01bd577a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s9g9g"
Apr 21 15:38:01.084450 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.084341 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c-node-exporter-textfile\") pod \"node-exporter-s64zs\" (UID: \"cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c\") " pod="openshift-monitoring/node-exporter-s64zs"
Apr 21 15:38:01.084450 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.084420 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c-root\") pod \"node-exporter-s64zs\" (UID: \"cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c\") " pod="openshift-monitoring/node-exporter-s64zs"
Apr 21 15:38:01.084669 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.084449 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-s64zs\" (UID: \"cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c\") " pod="openshift-monitoring/node-exporter-s64zs"
Apr 21 15:38:01.084669 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.084526 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c-node-exporter-accelerators-collector-config\") pod \"node-exporter-s64zs\" (UID:
\"cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c\") " pod="openshift-monitoring/node-exporter-s64zs"
Apr 21 15:38:01.084669 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.084581 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrdwz\" (UniqueName: \"kubernetes.io/projected/cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c-kube-api-access-jrdwz\") pod \"node-exporter-s64zs\" (UID: \"cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c\") " pod="openshift-monitoring/node-exporter-s64zs"
Apr 21 15:38:01.084669 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.084614 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f42786e0-4a65-43c9-a735-126e01bd577a-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-s9g9g\" (UID: \"f42786e0-4a65-43c9-a735-126e01bd577a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s9g9g"
Apr 21 15:38:01.084818 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.084685 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f42786e0-4a65-43c9-a735-126e01bd577a-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-s9g9g\" (UID: \"f42786e0-4a65-43c9-a735-126e01bd577a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s9g9g"
Apr 21 15:38:01.084818 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.084708 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c-sys\") pod \"node-exporter-s64zs\" (UID: \"cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c\") " pod="openshift-monitoring/node-exporter-s64zs"
Apr 21 15:38:01.084818 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.084728 2569
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c-node-exporter-tls\") pod \"node-exporter-s64zs\" (UID: \"cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c\") " pod="openshift-monitoring/node-exporter-s64zs"
Apr 21 15:38:01.084818 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.084744 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c-node-exporter-wtmp\") pod \"node-exporter-s64zs\" (UID: \"cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c\") " pod="openshift-monitoring/node-exporter-s64zs"
Apr 21 15:38:01.084945 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:38:01.084816 2569 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 21 15:38:01.084945 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:38:01.084868 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f42786e0-4a65-43c9-a735-126e01bd577a-openshift-state-metrics-tls podName:f42786e0-4a65-43c9-a735-126e01bd577a nodeName:}" failed. No retries permitted until 2026-04-21 15:38:01.584851243 +0000 UTC m=+157.156884165 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/f42786e0-4a65-43c9-a735-126e01bd577a-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-s9g9g" (UID: "f42786e0-4a65-43c9-a735-126e01bd577a") : secret "openshift-state-metrics-tls" not found
Apr 21 15:38:01.085037 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.085021 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f42786e0-4a65-43c9-a735-126e01bd577a-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-s9g9g\" (UID: \"f42786e0-4a65-43c9-a735-126e01bd577a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s9g9g"
Apr 21 15:38:01.086887 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.086867 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f42786e0-4a65-43c9-a735-126e01bd577a-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-s9g9g\" (UID: \"f42786e0-4a65-43c9-a735-126e01bd577a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s9g9g"
Apr 21 15:38:01.109340 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.109305 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7s2s\" (UniqueName: \"kubernetes.io/projected/f42786e0-4a65-43c9-a735-126e01bd577a-kube-api-access-q7s2s\") pod \"openshift-state-metrics-9d44df66c-s9g9g\" (UID: \"f42786e0-4a65-43c9-a735-126e01bd577a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s9g9g"
Apr 21 15:38:01.185176 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.185137 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c-metrics-client-ca\") pod
\"node-exporter-s64zs\" (UID: \"cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c\") " pod="openshift-monitoring/node-exporter-s64zs"
Apr 21 15:38:01.185176 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.185176 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c-node-exporter-textfile\") pod \"node-exporter-s64zs\" (UID: \"cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c\") " pod="openshift-monitoring/node-exporter-s64zs"
Apr 21 15:38:01.185428 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.185196 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c-root\") pod \"node-exporter-s64zs\" (UID: \"cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c\") " pod="openshift-monitoring/node-exporter-s64zs"
Apr 21 15:38:01.185428 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.185315 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c-root\") pod \"node-exporter-s64zs\" (UID: \"cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c\") " pod="openshift-monitoring/node-exporter-s64zs"
Apr 21 15:38:01.185428 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.185348 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-s64zs\" (UID: \"cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c\") " pod="openshift-monitoring/node-exporter-s64zs"
Apr 21 15:38:01.185428 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.185375 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName:
\"kubernetes.io/configmap/cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c-node-exporter-accelerators-collector-config\") pod \"node-exporter-s64zs\" (UID: \"cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c\") " pod="openshift-monitoring/node-exporter-s64zs"
Apr 21 15:38:01.185428 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.185405 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrdwz\" (UniqueName: \"kubernetes.io/projected/cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c-kube-api-access-jrdwz\") pod \"node-exporter-s64zs\" (UID: \"cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c\") " pod="openshift-monitoring/node-exporter-s64zs"
Apr 21 15:38:01.185617 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.185531 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c-sys\") pod \"node-exporter-s64zs\" (UID: \"cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c\") " pod="openshift-monitoring/node-exporter-s64zs"
Apr 21 15:38:01.185617 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.185559 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c-node-exporter-tls\") pod \"node-exporter-s64zs\" (UID: \"cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c\") " pod="openshift-monitoring/node-exporter-s64zs"
Apr 21 15:38:01.185617 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.185578 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c-node-exporter-wtmp\") pod \"node-exporter-s64zs\" (UID: \"cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c\") " pod="openshift-monitoring/node-exporter-s64zs"
Apr 21 15:38:01.185706 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.185644 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume
\"sys\" (UniqueName: \"kubernetes.io/host-path/cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c-sys\") pod \"node-exporter-s64zs\" (UID: \"cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c\") " pod="openshift-monitoring/node-exporter-s64zs"
Apr 21 15:38:01.185706 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:38:01.185678 2569 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 21 15:38:01.185706 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.185700 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c-node-exporter-wtmp\") pod \"node-exporter-s64zs\" (UID: \"cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c\") " pod="openshift-monitoring/node-exporter-s64zs"
Apr 21 15:38:01.185811 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:38:01.185731 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c-node-exporter-tls podName:cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c nodeName:}" failed. No retries permitted until 2026-04-21 15:38:01.685715667 +0000 UTC m=+157.257748589 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c-node-exporter-tls") pod "node-exporter-s64zs" (UID: "cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c") : secret "node-exporter-tls" not found Apr 21 15:38:01.185999 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.185980 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c-node-exporter-textfile\") pod \"node-exporter-s64zs\" (UID: \"cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c\") " pod="openshift-monitoring/node-exporter-s64zs" Apr 21 15:38:01.186248 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.186226 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c-metrics-client-ca\") pod \"node-exporter-s64zs\" (UID: \"cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c\") " pod="openshift-monitoring/node-exporter-s64zs" Apr 21 15:38:01.186314 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.186264 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c-node-exporter-accelerators-collector-config\") pod \"node-exporter-s64zs\" (UID: \"cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c\") " pod="openshift-monitoring/node-exporter-s64zs" Apr 21 15:38:01.187964 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.187948 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-s64zs\" (UID: \"cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c\") " pod="openshift-monitoring/node-exporter-s64zs" Apr 21 15:38:01.202337 ip-10-0-128-232 
kubenswrapper[2569]: I0421 15:38:01.202281 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrdwz\" (UniqueName: \"kubernetes.io/projected/cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c-kube-api-access-jrdwz\") pod \"node-exporter-s64zs\" (UID: \"cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c\") " pod="openshift-monitoring/node-exporter-s64zs" Apr 21 15:38:01.396202 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:38:01.396154 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-gdwg8" podUID="cb43fce4-df4e-4cca-a455-90d323512faf" Apr 21 15:38:01.408347 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:38:01.408311 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-jpjds" podUID="fffa5175-92d5-48ec-a153-baf3f061b044" Apr 21 15:38:01.572394 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.572299 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jpjds" Apr 21 15:38:01.572394 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.572332 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-gdwg8" Apr 21 15:38:01.589104 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.589065 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f42786e0-4a65-43c9-a735-126e01bd577a-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-s9g9g\" (UID: \"f42786e0-4a65-43c9-a735-126e01bd577a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s9g9g" Apr 21 15:38:01.591601 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.591571 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f42786e0-4a65-43c9-a735-126e01bd577a-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-s9g9g\" (UID: \"f42786e0-4a65-43c9-a735-126e01bd577a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s9g9g" Apr 21 15:38:01.689998 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.689949 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c-node-exporter-tls\") pod \"node-exporter-s64zs\" (UID: \"cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c\") " pod="openshift-monitoring/node-exporter-s64zs" Apr 21 15:38:01.692145 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.692116 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c-node-exporter-tls\") pod \"node-exporter-s64zs\" (UID: \"cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c\") " pod="openshift-monitoring/node-exporter-s64zs" Apr 21 15:38:01.852758 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.852670 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s9g9g" Apr 21 15:38:01.898575 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.897410 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-s64zs" Apr 21 15:38:01.909619 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:38:01.909582 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcac7a4e2_e818_4cbc_9a4d_c5bcd0f4340c.slice/crio-61b7237c00899dede5b5f6a5d3364f83d685d9a9be5a035d38e06bb463d4ac35 WatchSource:0}: Error finding container 61b7237c00899dede5b5f6a5d3364f83d685d9a9be5a035d38e06bb463d4ac35: Status 404 returned error can't find the container with id 61b7237c00899dede5b5f6a5d3364f83d685d9a9be5a035d38e06bb463d4ac35 Apr 21 15:38:01.984274 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:01.984245 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-s9g9g"] Apr 21 15:38:01.988111 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:38:01.988071 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf42786e0_4a65_43c9_a735_126e01bd577a.slice/crio-c60a4b7051f5f8a94e511a47559a7bcaa39aaa1b1600b5c2e700c922fc7367b5 WatchSource:0}: Error finding container c60a4b7051f5f8a94e511a47559a7bcaa39aaa1b1600b5c2e700c922fc7367b5: Status 404 returned error can't find the container with id c60a4b7051f5f8a94e511a47559a7bcaa39aaa1b1600b5c2e700c922fc7367b5 Apr 21 15:38:02.576384 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:02.576349 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-s64zs" event={"ID":"cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c","Type":"ContainerStarted","Data":"61b7237c00899dede5b5f6a5d3364f83d685d9a9be5a035d38e06bb463d4ac35"} Apr 21 15:38:02.578259 ip-10-0-128-232 
kubenswrapper[2569]: I0421 15:38:02.578226 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s9g9g" event={"ID":"f42786e0-4a65-43c9-a735-126e01bd577a","Type":"ContainerStarted","Data":"826064eac76286363173f0371ee36ab702b8c9dccc0d622140fec05abd22a645"} Apr 21 15:38:02.578378 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:02.578267 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s9g9g" event={"ID":"f42786e0-4a65-43c9-a735-126e01bd577a","Type":"ContainerStarted","Data":"2709a1ef42d0d3938dfd2fc44f2c6b93a5f2ce464323961e64a5c99c2bb260dd"} Apr 21 15:38:02.578378 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:02.578282 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s9g9g" event={"ID":"f42786e0-4a65-43c9-a735-126e01bd577a","Type":"ContainerStarted","Data":"c60a4b7051f5f8a94e511a47559a7bcaa39aaa1b1600b5c2e700c922fc7367b5"} Apr 21 15:38:03.070147 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:38:03.070104 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-lp64c" podUID="893ee07d-ac5e-4593-93fd-80655b690072" Apr 21 15:38:03.582633 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:03.582598 2569 generic.go:358] "Generic (PLEG): container finished" podID="cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c" containerID="50662fe5759c1c19a837cbb0ac77fc0aace4fcb950ab306b0f2d06ec197c6bfc" exitCode=0 Apr 21 15:38:03.582822 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:03.582670 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-s64zs" event={"ID":"cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c","Type":"ContainerDied","Data":"50662fe5759c1c19a837cbb0ac77fc0aace4fcb950ab306b0f2d06ec197c6bfc"} Apr 21 
15:38:03.584537 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:03.584512 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s9g9g" event={"ID":"f42786e0-4a65-43c9-a735-126e01bd577a","Type":"ContainerStarted","Data":"edc52274a632060d213499635c109b23f6caee2615f5db2534136739a1d327d1"} Apr 21 15:38:03.621178 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:03.621120 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s9g9g" podStartSLOduration=2.684604199 podStartE2EDuration="3.621101954s" podCreationTimestamp="2026-04-21 15:38:00 +0000 UTC" firstStartedPulling="2026-04-21 15:38:02.140544563 +0000 UTC m=+157.712577498" lastFinishedPulling="2026-04-21 15:38:03.077042329 +0000 UTC m=+158.649075253" observedRunningTime="2026-04-21 15:38:03.620392812 +0000 UTC m=+159.192425754" watchObservedRunningTime="2026-04-21 15:38:03.621101954 +0000 UTC m=+159.193134897" Apr 21 15:38:04.589268 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:04.589223 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-s64zs" event={"ID":"cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c","Type":"ContainerStarted","Data":"163c3f13d04e2b07b870a5fa3fc378f7e2c9c523593bf068f231a8f4da8c9821"} Apr 21 15:38:04.589268 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:04.589262 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-s64zs" event={"ID":"cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c","Type":"ContainerStarted","Data":"62853ea14c02d9fae5c16d4a47ad0516fc2aa9e0fd825a71583aec82763511c9"} Apr 21 15:38:04.610253 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:04.610199 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-s64zs" podStartSLOduration=3.488163309 podStartE2EDuration="4.610176683s" podCreationTimestamp="2026-04-21 15:38:00 
+0000 UTC" firstStartedPulling="2026-04-21 15:38:01.912252518 +0000 UTC m=+157.484285452" lastFinishedPulling="2026-04-21 15:38:03.034265901 +0000 UTC m=+158.606298826" observedRunningTime="2026-04-21 15:38:04.608819545 +0000 UTC m=+160.180852490" watchObservedRunningTime="2026-04-21 15:38:04.610176683 +0000 UTC m=+160.182209625" Apr 21 15:38:05.971146 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:05.971106 2569 patch_prober.go:28] interesting pod/image-registry-74dfb5f878-97xdw container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 21 15:38:05.971573 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:05.971174 2569 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-74dfb5f878-97xdw" podUID="23290611-2011-4b35-851e-ccedc88a9391" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 15:38:06.232501 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:06.232372 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb43fce4-df4e-4cca-a455-90d323512faf-metrics-tls\") pod \"dns-default-gdwg8\" (UID: \"cb43fce4-df4e-4cca-a455-90d323512faf\") " pod="openshift-dns/dns-default-gdwg8" Apr 21 15:38:06.232501 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:06.232431 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fffa5175-92d5-48ec-a153-baf3f061b044-cert\") pod \"ingress-canary-jpjds\" (UID: \"fffa5175-92d5-48ec-a153-baf3f061b044\") " pod="openshift-ingress-canary/ingress-canary-jpjds" Apr 21 15:38:06.234897 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:06.234864 2569 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb43fce4-df4e-4cca-a455-90d323512faf-metrics-tls\") pod \"dns-default-gdwg8\" (UID: \"cb43fce4-df4e-4cca-a455-90d323512faf\") " pod="openshift-dns/dns-default-gdwg8" Apr 21 15:38:06.235006 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:06.234922 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fffa5175-92d5-48ec-a153-baf3f061b044-cert\") pod \"ingress-canary-jpjds\" (UID: \"fffa5175-92d5-48ec-a153-baf3f061b044\") " pod="openshift-ingress-canary/ingress-canary-jpjds" Apr 21 15:38:06.375326 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:06.375294 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xkwb5\"" Apr 21 15:38:06.376184 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:06.376159 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7ld2z\"" Apr 21 15:38:06.383181 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:06.383159 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gdwg8" Apr 21 15:38:06.383283 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:06.383239 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jpjds" Apr 21 15:38:06.513212 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:06.513179 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jpjds"] Apr 21 15:38:06.517300 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:38:06.517265 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfffa5175_92d5_48ec_a153_baf3f061b044.slice/crio-153f12aaf2ab1efca17f918ad122da914d4fcb74f9f346f0e566ba53e3185db5 WatchSource:0}: Error finding container 153f12aaf2ab1efca17f918ad122da914d4fcb74f9f346f0e566ba53e3185db5: Status 404 returned error can't find the container with id 153f12aaf2ab1efca17f918ad122da914d4fcb74f9f346f0e566ba53e3185db5 Apr 21 15:38:06.531469 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:06.531439 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gdwg8"] Apr 21 15:38:06.535896 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:38:06.535864 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb43fce4_df4e_4cca_a455_90d323512faf.slice/crio-a8d1b7bb6253cc0744231484bfc305c1f3d8ddb926a25ee34f0caf1cbf28227d WatchSource:0}: Error finding container a8d1b7bb6253cc0744231484bfc305c1f3d8ddb926a25ee34f0caf1cbf28227d: Status 404 returned error can't find the container with id a8d1b7bb6253cc0744231484bfc305c1f3d8ddb926a25ee34f0caf1cbf28227d Apr 21 15:38:06.597988 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:06.597945 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jpjds" event={"ID":"fffa5175-92d5-48ec-a153-baf3f061b044","Type":"ContainerStarted","Data":"153f12aaf2ab1efca17f918ad122da914d4fcb74f9f346f0e566ba53e3185db5"} Apr 21 15:38:06.599286 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:06.599258 2569 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gdwg8" event={"ID":"cb43fce4-df4e-4cca-a455-90d323512faf","Type":"ContainerStarted","Data":"a8d1b7bb6253cc0744231484bfc305c1f3d8ddb926a25ee34f0caf1cbf28227d"} Apr 21 15:38:07.532615 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:07.532567 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-74dfb5f878-97xdw" Apr 21 15:38:08.610140 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:08.610106 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jpjds" event={"ID":"fffa5175-92d5-48ec-a153-baf3f061b044","Type":"ContainerStarted","Data":"ba40d01ccc3fa7169ec83ae8718f1751a3f3e0eb5f3bbb8c228357038e07449c"} Apr 21 15:38:08.611740 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:08.611716 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gdwg8" event={"ID":"cb43fce4-df4e-4cca-a455-90d323512faf","Type":"ContainerStarted","Data":"414354f81fc450970882c4629d881767bf0a3e5a3fca7d044846989f26843424"} Apr 21 15:38:08.647085 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:08.647034 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-jpjds" podStartSLOduration=128.673925172 podStartE2EDuration="2m10.647017197s" podCreationTimestamp="2026-04-21 15:35:58 +0000 UTC" firstStartedPulling="2026-04-21 15:38:06.519239736 +0000 UTC m=+162.091272657" lastFinishedPulling="2026-04-21 15:38:08.492331761 +0000 UTC m=+164.064364682" observedRunningTime="2026-04-21 15:38:08.646279519 +0000 UTC m=+164.218312506" watchObservedRunningTime="2026-04-21 15:38:08.647017197 +0000 UTC m=+164.219050140" Apr 21 15:38:09.615889 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:09.615855 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gdwg8" 
event={"ID":"cb43fce4-df4e-4cca-a455-90d323512faf","Type":"ContainerStarted","Data":"2fa77c5dba5effbc0014c193ee75d6332f5b85d1eefe715a9cab6ca9bc37e3c6"} Apr 21 15:38:09.616324 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:09.615960 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-gdwg8" Apr 21 15:38:15.233753 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:15.233692 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-gdwg8" podStartSLOduration=135.282272716 podStartE2EDuration="2m17.233676291s" podCreationTimestamp="2026-04-21 15:35:58 +0000 UTC" firstStartedPulling="2026-04-21 15:38:06.537687566 +0000 UTC m=+162.109720491" lastFinishedPulling="2026-04-21 15:38:08.48909113 +0000 UTC m=+164.061124066" observedRunningTime="2026-04-21 15:38:09.641309195 +0000 UTC m=+165.213342138" watchObservedRunningTime="2026-04-21 15:38:15.233676291 +0000 UTC m=+170.805709245" Apr 21 15:38:15.234787 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:15.234768 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-7vrxq"] Apr 21 15:38:15.237895 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:15.237878 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-7vrxq" Apr 21 15:38:15.240436 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:15.240413 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 21 15:38:15.240652 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:15.240621 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-v6hmp\"" Apr 21 15:38:15.240751 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:15.240702 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 21 15:38:15.248808 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:15.248783 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-7vrxq"] Apr 21 15:38:15.313283 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:15.313241 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhmgn\" (UniqueName: \"kubernetes.io/projected/17c163e7-de47-4efa-bea9-78d232508160-kube-api-access-vhmgn\") pod \"downloads-6bcc868b7-7vrxq\" (UID: \"17c163e7-de47-4efa-bea9-78d232508160\") " pod="openshift-console/downloads-6bcc868b7-7vrxq" Apr 21 15:38:15.414468 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:15.414423 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhmgn\" (UniqueName: \"kubernetes.io/projected/17c163e7-de47-4efa-bea9-78d232508160-kube-api-access-vhmgn\") pod \"downloads-6bcc868b7-7vrxq\" (UID: \"17c163e7-de47-4efa-bea9-78d232508160\") " pod="openshift-console/downloads-6bcc868b7-7vrxq" Apr 21 15:38:15.427882 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:15.427846 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhmgn\" (UniqueName: 
\"kubernetes.io/projected/17c163e7-de47-4efa-bea9-78d232508160-kube-api-access-vhmgn\") pod \"downloads-6bcc868b7-7vrxq\" (UID: \"17c163e7-de47-4efa-bea9-78d232508160\") " pod="openshift-console/downloads-6bcc868b7-7vrxq" Apr 21 15:38:15.547516 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:15.547412 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-7vrxq" Apr 21 15:38:15.682392 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:15.682359 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-7vrxq"] Apr 21 15:38:15.685382 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:38:15.685350 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17c163e7_de47_4efa_bea9_78d232508160.slice/crio-c843bed62408c0853719efb7726f9fb782405c4076405727b56c3698dcedd457 WatchSource:0}: Error finding container c843bed62408c0853719efb7726f9fb782405c4076405727b56c3698dcedd457: Status 404 returned error can't find the container with id c843bed62408c0853719efb7726f9fb782405c4076405727b56c3698dcedd457 Apr 21 15:38:16.641634 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:16.641601 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-7vrxq" event={"ID":"17c163e7-de47-4efa-bea9-78d232508160","Type":"ContainerStarted","Data":"c843bed62408c0853719efb7726f9fb782405c4076405727b56c3698dcedd457"} Apr 21 15:38:18.050549 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:18.050511 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lp64c" Apr 21 15:38:19.620815 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:19.620778 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-gdwg8" Apr 21 15:38:25.392209 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:25.392171 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6c6944f596-9v9wx"] Apr 21 15:38:25.396862 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:25.396836 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c6944f596-9v9wx" Apr 21 15:38:25.399685 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:25.399593 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 21 15:38:25.399685 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:25.399622 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 21 15:38:25.399685 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:25.399668 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 21 15:38:25.399961 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:25.399712 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-brhfv\"" Apr 21 15:38:25.400918 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:25.400885 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 21 15:38:25.400918 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:25.400908 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 21 15:38:25.417199 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:25.415120 2569 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c6944f596-9v9wx"] Apr 21 15:38:25.504408 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:25.504366 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/51ec296f-351d-4806-aaa5-b062c09b93fa-oauth-serving-cert\") pod \"console-6c6944f596-9v9wx\" (UID: \"51ec296f-351d-4806-aaa5-b062c09b93fa\") " pod="openshift-console/console-6c6944f596-9v9wx" Apr 21 15:38:25.504629 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:25.504429 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51ec296f-351d-4806-aaa5-b062c09b93fa-service-ca\") pod \"console-6c6944f596-9v9wx\" (UID: \"51ec296f-351d-4806-aaa5-b062c09b93fa\") " pod="openshift-console/console-6c6944f596-9v9wx" Apr 21 15:38:25.504629 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:25.504566 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/51ec296f-351d-4806-aaa5-b062c09b93fa-console-config\") pod \"console-6c6944f596-9v9wx\" (UID: \"51ec296f-351d-4806-aaa5-b062c09b93fa\") " pod="openshift-console/console-6c6944f596-9v9wx" Apr 21 15:38:25.504629 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:25.504616 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/51ec296f-351d-4806-aaa5-b062c09b93fa-console-oauth-config\") pod \"console-6c6944f596-9v9wx\" (UID: \"51ec296f-351d-4806-aaa5-b062c09b93fa\") " pod="openshift-console/console-6c6944f596-9v9wx" Apr 21 15:38:25.504793 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:25.504664 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/51ec296f-351d-4806-aaa5-b062c09b93fa-console-serving-cert\") pod \"console-6c6944f596-9v9wx\" (UID: \"51ec296f-351d-4806-aaa5-b062c09b93fa\") " pod="openshift-console/console-6c6944f596-9v9wx" Apr 21 15:38:25.504793 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:25.504759 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prvtt\" (UniqueName: \"kubernetes.io/projected/51ec296f-351d-4806-aaa5-b062c09b93fa-kube-api-access-prvtt\") pod \"console-6c6944f596-9v9wx\" (UID: \"51ec296f-351d-4806-aaa5-b062c09b93fa\") " pod="openshift-console/console-6c6944f596-9v9wx" Apr 21 15:38:25.605669 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:25.605633 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/51ec296f-351d-4806-aaa5-b062c09b93fa-oauth-serving-cert\") pod \"console-6c6944f596-9v9wx\" (UID: \"51ec296f-351d-4806-aaa5-b062c09b93fa\") " pod="openshift-console/console-6c6944f596-9v9wx" Apr 21 15:38:25.605867 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:25.605690 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51ec296f-351d-4806-aaa5-b062c09b93fa-service-ca\") pod \"console-6c6944f596-9v9wx\" (UID: \"51ec296f-351d-4806-aaa5-b062c09b93fa\") " pod="openshift-console/console-6c6944f596-9v9wx" Apr 21 15:38:25.605867 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:25.605727 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/51ec296f-351d-4806-aaa5-b062c09b93fa-console-config\") pod \"console-6c6944f596-9v9wx\" (UID: \"51ec296f-351d-4806-aaa5-b062c09b93fa\") " pod="openshift-console/console-6c6944f596-9v9wx" Apr 21 15:38:25.605867 ip-10-0-128-232 kubenswrapper[2569]: I0421 
15:38:25.605753 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/51ec296f-351d-4806-aaa5-b062c09b93fa-console-oauth-config\") pod \"console-6c6944f596-9v9wx\" (UID: \"51ec296f-351d-4806-aaa5-b062c09b93fa\") " pod="openshift-console/console-6c6944f596-9v9wx" Apr 21 15:38:25.605867 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:25.605784 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/51ec296f-351d-4806-aaa5-b062c09b93fa-console-serving-cert\") pod \"console-6c6944f596-9v9wx\" (UID: \"51ec296f-351d-4806-aaa5-b062c09b93fa\") " pod="openshift-console/console-6c6944f596-9v9wx" Apr 21 15:38:25.605867 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:25.605856 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prvtt\" (UniqueName: \"kubernetes.io/projected/51ec296f-351d-4806-aaa5-b062c09b93fa-kube-api-access-prvtt\") pod \"console-6c6944f596-9v9wx\" (UID: \"51ec296f-351d-4806-aaa5-b062c09b93fa\") " pod="openshift-console/console-6c6944f596-9v9wx" Apr 21 15:38:25.606788 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:25.606741 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51ec296f-351d-4806-aaa5-b062c09b93fa-service-ca\") pod \"console-6c6944f596-9v9wx\" (UID: \"51ec296f-351d-4806-aaa5-b062c09b93fa\") " pod="openshift-console/console-6c6944f596-9v9wx" Apr 21 15:38:25.607086 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:25.607062 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/51ec296f-351d-4806-aaa5-b062c09b93fa-console-config\") pod \"console-6c6944f596-9v9wx\" (UID: \"51ec296f-351d-4806-aaa5-b062c09b93fa\") " pod="openshift-console/console-6c6944f596-9v9wx" Apr 21 
15:38:25.607255 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:25.607231 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/51ec296f-351d-4806-aaa5-b062c09b93fa-oauth-serving-cert\") pod \"console-6c6944f596-9v9wx\" (UID: \"51ec296f-351d-4806-aaa5-b062c09b93fa\") " pod="openshift-console/console-6c6944f596-9v9wx" Apr 21 15:38:25.608609 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:25.608584 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/51ec296f-351d-4806-aaa5-b062c09b93fa-console-oauth-config\") pod \"console-6c6944f596-9v9wx\" (UID: \"51ec296f-351d-4806-aaa5-b062c09b93fa\") " pod="openshift-console/console-6c6944f596-9v9wx" Apr 21 15:38:25.608718 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:25.608613 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/51ec296f-351d-4806-aaa5-b062c09b93fa-console-serving-cert\") pod \"console-6c6944f596-9v9wx\" (UID: \"51ec296f-351d-4806-aaa5-b062c09b93fa\") " pod="openshift-console/console-6c6944f596-9v9wx" Apr 21 15:38:25.623126 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:25.623095 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prvtt\" (UniqueName: \"kubernetes.io/projected/51ec296f-351d-4806-aaa5-b062c09b93fa-kube-api-access-prvtt\") pod \"console-6c6944f596-9v9wx\" (UID: \"51ec296f-351d-4806-aaa5-b062c09b93fa\") " pod="openshift-console/console-6c6944f596-9v9wx" Apr 21 15:38:25.708249 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:25.708214 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c6944f596-9v9wx" Apr 21 15:38:32.240647 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:32.240618 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c6944f596-9v9wx"] Apr 21 15:38:32.244729 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:38:32.244684 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51ec296f_351d_4806_aaa5_b062c09b93fa.slice/crio-2daa4c999d3030718ded30cbcb4c9c1d3e83ea100efde3b0516bd8ec99fa3419 WatchSource:0}: Error finding container 2daa4c999d3030718ded30cbcb4c9c1d3e83ea100efde3b0516bd8ec99fa3419: Status 404 returned error can't find the container with id 2daa4c999d3030718ded30cbcb4c9c1d3e83ea100efde3b0516bd8ec99fa3419 Apr 21 15:38:32.689820 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:32.689778 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-7vrxq" event={"ID":"17c163e7-de47-4efa-bea9-78d232508160","Type":"ContainerStarted","Data":"f6c2349f7bd207d320a4b58b566283c8ed7a2126d3683b82480202a6b8978af6"} Apr 21 15:38:32.690002 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:32.689951 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-7vrxq" Apr 21 15:38:32.690977 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:32.690940 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c6944f596-9v9wx" event={"ID":"51ec296f-351d-4806-aaa5-b062c09b93fa","Type":"ContainerStarted","Data":"2daa4c999d3030718ded30cbcb4c9c1d3e83ea100efde3b0516bd8ec99fa3419"} Apr 21 15:38:32.706064 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:32.706029 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-7vrxq" Apr 21 15:38:32.713422 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:32.713352 2569 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-7vrxq" podStartSLOduration=1.215028091 podStartE2EDuration="17.713333609s" podCreationTimestamp="2026-04-21 15:38:15 +0000 UTC" firstStartedPulling="2026-04-21 15:38:15.687599854 +0000 UTC m=+171.259632779" lastFinishedPulling="2026-04-21 15:38:32.185905369 +0000 UTC m=+187.757938297" observedRunningTime="2026-04-21 15:38:32.712942743 +0000 UTC m=+188.284975711" watchObservedRunningTime="2026-04-21 15:38:32.713333609 +0000 UTC m=+188.285366554" Apr 21 15:38:33.938041 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:33.937562 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-86c78684d4-llkv7"] Apr 21 15:38:33.942111 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:33.942080 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86c78684d4-llkv7" Apr 21 15:38:33.953375 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:33.952604 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 21 15:38:33.954467 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:33.954435 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86c78684d4-llkv7"] Apr 21 15:38:34.086709 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:34.086604 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c3b8617f-dec2-4795-85bb-f49984b37c08-console-config\") pod \"console-86c78684d4-llkv7\" (UID: \"c3b8617f-dec2-4795-85bb-f49984b37c08\") " pod="openshift-console/console-86c78684d4-llkv7" Apr 21 15:38:34.086947 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:34.086718 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c3b8617f-dec2-4795-85bb-f49984b37c08-trusted-ca-bundle\") pod \"console-86c78684d4-llkv7\" (UID: \"c3b8617f-dec2-4795-85bb-f49984b37c08\") " pod="openshift-console/console-86c78684d4-llkv7" Apr 21 15:38:34.086947 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:34.086771 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c3b8617f-dec2-4795-85bb-f49984b37c08-oauth-serving-cert\") pod \"console-86c78684d4-llkv7\" (UID: \"c3b8617f-dec2-4795-85bb-f49984b37c08\") " pod="openshift-console/console-86c78684d4-llkv7" Apr 21 15:38:34.086947 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:34.086802 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c3b8617f-dec2-4795-85bb-f49984b37c08-console-serving-cert\") pod \"console-86c78684d4-llkv7\" (UID: \"c3b8617f-dec2-4795-85bb-f49984b37c08\") " pod="openshift-console/console-86c78684d4-llkv7" Apr 21 15:38:34.086947 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:34.086866 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c3b8617f-dec2-4795-85bb-f49984b37c08-service-ca\") pod \"console-86c78684d4-llkv7\" (UID: \"c3b8617f-dec2-4795-85bb-f49984b37c08\") " pod="openshift-console/console-86c78684d4-llkv7" Apr 21 15:38:34.086947 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:34.086931 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c3b8617f-dec2-4795-85bb-f49984b37c08-console-oauth-config\") pod \"console-86c78684d4-llkv7\" (UID: \"c3b8617f-dec2-4795-85bb-f49984b37c08\") " pod="openshift-console/console-86c78684d4-llkv7" Apr 21 15:38:34.087238 ip-10-0-128-232 
kubenswrapper[2569]: I0421 15:38:34.086991 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvvrd\" (UniqueName: \"kubernetes.io/projected/c3b8617f-dec2-4795-85bb-f49984b37c08-kube-api-access-jvvrd\") pod \"console-86c78684d4-llkv7\" (UID: \"c3b8617f-dec2-4795-85bb-f49984b37c08\") " pod="openshift-console/console-86c78684d4-llkv7" Apr 21 15:38:34.188983 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:34.188212 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c3b8617f-dec2-4795-85bb-f49984b37c08-console-config\") pod \"console-86c78684d4-llkv7\" (UID: \"c3b8617f-dec2-4795-85bb-f49984b37c08\") " pod="openshift-console/console-86c78684d4-llkv7" Apr 21 15:38:34.188983 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:34.188291 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3b8617f-dec2-4795-85bb-f49984b37c08-trusted-ca-bundle\") pod \"console-86c78684d4-llkv7\" (UID: \"c3b8617f-dec2-4795-85bb-f49984b37c08\") " pod="openshift-console/console-86c78684d4-llkv7" Apr 21 15:38:34.188983 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:34.188323 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c3b8617f-dec2-4795-85bb-f49984b37c08-oauth-serving-cert\") pod \"console-86c78684d4-llkv7\" (UID: \"c3b8617f-dec2-4795-85bb-f49984b37c08\") " pod="openshift-console/console-86c78684d4-llkv7" Apr 21 15:38:34.188983 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:34.188352 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c3b8617f-dec2-4795-85bb-f49984b37c08-console-serving-cert\") pod \"console-86c78684d4-llkv7\" (UID: 
\"c3b8617f-dec2-4795-85bb-f49984b37c08\") " pod="openshift-console/console-86c78684d4-llkv7" Apr 21 15:38:34.188983 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:34.188377 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c3b8617f-dec2-4795-85bb-f49984b37c08-service-ca\") pod \"console-86c78684d4-llkv7\" (UID: \"c3b8617f-dec2-4795-85bb-f49984b37c08\") " pod="openshift-console/console-86c78684d4-llkv7" Apr 21 15:38:34.188983 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:34.188422 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c3b8617f-dec2-4795-85bb-f49984b37c08-console-oauth-config\") pod \"console-86c78684d4-llkv7\" (UID: \"c3b8617f-dec2-4795-85bb-f49984b37c08\") " pod="openshift-console/console-86c78684d4-llkv7" Apr 21 15:38:34.188983 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:34.188470 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jvvrd\" (UniqueName: \"kubernetes.io/projected/c3b8617f-dec2-4795-85bb-f49984b37c08-kube-api-access-jvvrd\") pod \"console-86c78684d4-llkv7\" (UID: \"c3b8617f-dec2-4795-85bb-f49984b37c08\") " pod="openshift-console/console-86c78684d4-llkv7" Apr 21 15:38:34.189459 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:34.189313 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c3b8617f-dec2-4795-85bb-f49984b37c08-console-config\") pod \"console-86c78684d4-llkv7\" (UID: \"c3b8617f-dec2-4795-85bb-f49984b37c08\") " pod="openshift-console/console-86c78684d4-llkv7" Apr 21 15:38:34.189784 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:34.189758 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/c3b8617f-dec2-4795-85bb-f49984b37c08-oauth-serving-cert\") pod \"console-86c78684d4-llkv7\" (UID: \"c3b8617f-dec2-4795-85bb-f49984b37c08\") " pod="openshift-console/console-86c78684d4-llkv7" Apr 21 15:38:34.189868 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:34.189780 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3b8617f-dec2-4795-85bb-f49984b37c08-trusted-ca-bundle\") pod \"console-86c78684d4-llkv7\" (UID: \"c3b8617f-dec2-4795-85bb-f49984b37c08\") " pod="openshift-console/console-86c78684d4-llkv7" Apr 21 15:38:34.190314 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:34.190289 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c3b8617f-dec2-4795-85bb-f49984b37c08-service-ca\") pod \"console-86c78684d4-llkv7\" (UID: \"c3b8617f-dec2-4795-85bb-f49984b37c08\") " pod="openshift-console/console-86c78684d4-llkv7" Apr 21 15:38:34.191717 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:34.191672 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c3b8617f-dec2-4795-85bb-f49984b37c08-console-serving-cert\") pod \"console-86c78684d4-llkv7\" (UID: \"c3b8617f-dec2-4795-85bb-f49984b37c08\") " pod="openshift-console/console-86c78684d4-llkv7" Apr 21 15:38:34.192589 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:34.192539 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c3b8617f-dec2-4795-85bb-f49984b37c08-console-oauth-config\") pod \"console-86c78684d4-llkv7\" (UID: \"c3b8617f-dec2-4795-85bb-f49984b37c08\") " pod="openshift-console/console-86c78684d4-llkv7" Apr 21 15:38:34.198806 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:34.198772 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jvvrd\" (UniqueName: \"kubernetes.io/projected/c3b8617f-dec2-4795-85bb-f49984b37c08-kube-api-access-jvvrd\") pod \"console-86c78684d4-llkv7\" (UID: \"c3b8617f-dec2-4795-85bb-f49984b37c08\") " pod="openshift-console/console-86c78684d4-llkv7" Apr 21 15:38:34.255931 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:34.255890 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86c78684d4-llkv7" Apr 21 15:38:34.419496 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:34.419419 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86c78684d4-llkv7"] Apr 21 15:38:34.423272 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:38:34.423236 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3b8617f_dec2_4795_85bb_f49984b37c08.slice/crio-0c29ecf315cc3cfecb19135667a0b8ae9212dbd76737f5e2fc216172ed6dfd9a WatchSource:0}: Error finding container 0c29ecf315cc3cfecb19135667a0b8ae9212dbd76737f5e2fc216172ed6dfd9a: Status 404 returned error can't find the container with id 0c29ecf315cc3cfecb19135667a0b8ae9212dbd76737f5e2fc216172ed6dfd9a Apr 21 15:38:34.700386 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:34.700340 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86c78684d4-llkv7" event={"ID":"c3b8617f-dec2-4795-85bb-f49984b37c08","Type":"ContainerStarted","Data":"0c29ecf315cc3cfecb19135667a0b8ae9212dbd76737f5e2fc216172ed6dfd9a"} Apr 21 15:38:36.708985 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:36.708945 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c6944f596-9v9wx" event={"ID":"51ec296f-351d-4806-aaa5-b062c09b93fa","Type":"ContainerStarted","Data":"f968e276d66aaa6589877b61c0c16b680d6a3f7e736cd6ad20f606ab661e5524"} Apr 21 15:38:36.710458 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:36.710425 2569 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-console/console-86c78684d4-llkv7" event={"ID":"c3b8617f-dec2-4795-85bb-f49984b37c08","Type":"ContainerStarted","Data":"2e4dd04b414ac6d18a1c40bd0d8533c1e44b7272b8d0af0f440b35442465525b"} Apr 21 15:38:36.740106 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:36.740050 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6c6944f596-9v9wx" podStartSLOduration=7.698666675 podStartE2EDuration="11.740030931s" podCreationTimestamp="2026-04-21 15:38:25 +0000 UTC" firstStartedPulling="2026-04-21 15:38:32.247385706 +0000 UTC m=+187.819418640" lastFinishedPulling="2026-04-21 15:38:36.288749961 +0000 UTC m=+191.860782896" observedRunningTime="2026-04-21 15:38:36.738621185 +0000 UTC m=+192.310654161" watchObservedRunningTime="2026-04-21 15:38:36.740030931 +0000 UTC m=+192.312063884" Apr 21 15:38:36.770223 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:36.770140 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-86c78684d4-llkv7" podStartSLOduration=1.906113606 podStartE2EDuration="3.770119476s" podCreationTimestamp="2026-04-21 15:38:33 +0000 UTC" firstStartedPulling="2026-04-21 15:38:34.425679313 +0000 UTC m=+189.997712237" lastFinishedPulling="2026-04-21 15:38:36.289685171 +0000 UTC m=+191.861718107" observedRunningTime="2026-04-21 15:38:36.769576478 +0000 UTC m=+192.341609422" watchObservedRunningTime="2026-04-21 15:38:36.770119476 +0000 UTC m=+192.342152420" Apr 21 15:38:42.731223 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:42.731190 2569 generic.go:358] "Generic (PLEG): container finished" podID="f4e9ee47-7719-418f-90bf-ada2af6eab08" containerID="ff73da9c914d8b48ab62707b6ccbcaf1dc60519ac06589b93e48c5deea0c39d6" exitCode=0 Apr 21 15:38:42.731694 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:42.731257 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qdkns" event={"ID":"f4e9ee47-7719-418f-90bf-ada2af6eab08","Type":"ContainerDied","Data":"ff73da9c914d8b48ab62707b6ccbcaf1dc60519ac06589b93e48c5deea0c39d6"} Apr 21 15:38:42.731694 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:42.731646 2569 scope.go:117] "RemoveContainer" containerID="ff73da9c914d8b48ab62707b6ccbcaf1dc60519ac06589b93e48c5deea0c39d6" Apr 21 15:38:42.732687 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:42.732663 2569 generic.go:358] "Generic (PLEG): container finished" podID="9b8e8467-e89b-4cd1-b772-d34e71416962" containerID="5f6b441fe8aa508a2cf3f8696bccecb0fe355dbb04922c3466db54146cc8d25b" exitCode=0 Apr 21 15:38:42.732778 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:42.732733 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-mbdj9" event={"ID":"9b8e8467-e89b-4cd1-b772-d34e71416962","Type":"ContainerDied","Data":"5f6b441fe8aa508a2cf3f8696bccecb0fe355dbb04922c3466db54146cc8d25b"} Apr 21 15:38:42.733027 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:42.733013 2569 scope.go:117] "RemoveContainer" containerID="5f6b441fe8aa508a2cf3f8696bccecb0fe355dbb04922c3466db54146cc8d25b" Apr 21 15:38:43.737192 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:43.737160 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qdkns" event={"ID":"f4e9ee47-7719-418f-90bf-ada2af6eab08","Type":"ContainerStarted","Data":"999e3e285f7b8650965f89085746c5dea37b283a9b10c106a095c29f2ac9b726"} Apr 21 15:38:43.738780 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:43.738760 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-mbdj9" 
event={"ID":"9b8e8467-e89b-4cd1-b772-d34e71416962","Type":"ContainerStarted","Data":"df2f71e5e0d2f133e57b00f11e6cfec3d438182cb528ec8fa5c03ef161c91f37"} Apr 21 15:38:44.256404 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:44.256370 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-86c78684d4-llkv7" Apr 21 15:38:44.256605 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:44.256461 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-86c78684d4-llkv7" Apr 21 15:38:44.261210 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:44.261178 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-86c78684d4-llkv7" Apr 21 15:38:44.745978 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:44.745950 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-86c78684d4-llkv7" Apr 21 15:38:44.827169 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:44.827140 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c6944f596-9v9wx"] Apr 21 15:38:45.708728 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:45.708689 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6c6944f596-9v9wx" Apr 21 15:38:46.444928 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:46.444899 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s64zs_cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c/init-textfile/0.log" Apr 21 15:38:46.647406 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:46.647377 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s64zs_cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c/node-exporter/0.log" Apr 21 15:38:46.851410 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:46.851331 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-s64zs_cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c/kube-rbac-proxy/0.log" Apr 21 15:38:47.645846 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:47.645822 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-s9g9g_f42786e0-4a65-43c9-a735-126e01bd577a/kube-rbac-proxy-main/0.log" Apr 21 15:38:47.847177 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:47.847144 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-s9g9g_f42786e0-4a65-43c9-a735-126e01bd577a/kube-rbac-proxy-self/0.log" Apr 21 15:38:48.046359 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:48.046321 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-s9g9g_f42786e0-4a65-43c9-a735-126e01bd577a/openshift-state-metrics/0.log" Apr 21 15:38:49.648004 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:49.647969 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-rmh8w_110cb247-6c78-4666-aba6-4a6ac658c728/prometheus-operator/0.log" Apr 21 15:38:49.846297 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:49.846258 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-rmh8w_110cb247-6c78-4666-aba6-4a6ac658c728/kube-rbac-proxy/0.log" Apr 21 15:38:50.045796 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:50.045770 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-522f8_1e991f09-4d79-4d2a-9195-adc43e8fcfc4/prometheus-operator-admission-webhook/0.log" Apr 21 15:38:51.445332 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:51.445308 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-qvkhd_54dec317-f338-45be-84df-5116ae87636c/networking-console-plugin/0.log" Apr 21 15:38:52.045606 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:52.045567 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c6944f596-9v9wx_51ec296f-351d-4806-aaa5-b062c09b93fa/console/0.log" Apr 21 15:38:52.247311 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:52.247284 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86c78684d4-llkv7_c3b8617f-dec2-4795-85bb-f49984b37c08/console/0.log" Apr 21 15:38:52.449191 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:52.449146 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-7vrxq_17c163e7-de47-4efa-bea9-78d232508160/download-server/0.log" Apr 21 15:38:53.769356 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:53.769321 2569 generic.go:358] "Generic (PLEG): container finished" podID="53a049f2-3e84-402e-91d3-5911909aa995" containerID="5af54804d171ec5bdb52c46a9aba341b6d40d56db78a71a51eb3cd9a8d0ebf69" exitCode=0 Apr 21 15:38:53.769867 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:53.769393 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dhpnj" event={"ID":"53a049f2-3e84-402e-91d3-5911909aa995","Type":"ContainerDied","Data":"5af54804d171ec5bdb52c46a9aba341b6d40d56db78a71a51eb3cd9a8d0ebf69"} Apr 21 15:38:53.769867 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:53.769757 2569 scope.go:117] "RemoveContainer" containerID="5af54804d171ec5bdb52c46a9aba341b6d40d56db78a71a51eb3cd9a8d0ebf69" Apr 21 15:38:54.775518 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:38:54.774701 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dhpnj" 
event={"ID":"53a049f2-3e84-402e-91d3-5911909aa995","Type":"ContainerStarted","Data":"9205ecf5fd61511a7a12b42d3290b65a4434ac0245d488f02ac30188c453e711"} Apr 21 15:39:09.846661 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:09.846621 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6c6944f596-9v9wx" podUID="51ec296f-351d-4806-aaa5-b062c09b93fa" containerName="console" containerID="cri-o://f968e276d66aaa6589877b61c0c16b680d6a3f7e736cd6ad20f606ab661e5524" gracePeriod=15 Apr 21 15:39:10.118611 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:10.118584 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c6944f596-9v9wx_51ec296f-351d-4806-aaa5-b062c09b93fa/console/0.log" Apr 21 15:39:10.118754 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:10.118675 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c6944f596-9v9wx" Apr 21 15:39:10.214223 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:10.214185 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/51ec296f-351d-4806-aaa5-b062c09b93fa-console-config\") pod \"51ec296f-351d-4806-aaa5-b062c09b93fa\" (UID: \"51ec296f-351d-4806-aaa5-b062c09b93fa\") " Apr 21 15:39:10.214412 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:10.214242 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/51ec296f-351d-4806-aaa5-b062c09b93fa-oauth-serving-cert\") pod \"51ec296f-351d-4806-aaa5-b062c09b93fa\" (UID: \"51ec296f-351d-4806-aaa5-b062c09b93fa\") " Apr 21 15:39:10.214412 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:10.214275 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/51ec296f-351d-4806-aaa5-b062c09b93fa-service-ca\") pod \"51ec296f-351d-4806-aaa5-b062c09b93fa\" (UID: \"51ec296f-351d-4806-aaa5-b062c09b93fa\") " Apr 21 15:39:10.214412 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:10.214291 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prvtt\" (UniqueName: \"kubernetes.io/projected/51ec296f-351d-4806-aaa5-b062c09b93fa-kube-api-access-prvtt\") pod \"51ec296f-351d-4806-aaa5-b062c09b93fa\" (UID: \"51ec296f-351d-4806-aaa5-b062c09b93fa\") " Apr 21 15:39:10.214412 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:10.214318 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/51ec296f-351d-4806-aaa5-b062c09b93fa-console-serving-cert\") pod \"51ec296f-351d-4806-aaa5-b062c09b93fa\" (UID: \"51ec296f-351d-4806-aaa5-b062c09b93fa\") " Apr 21 15:39:10.214412 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:10.214350 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/51ec296f-351d-4806-aaa5-b062c09b93fa-console-oauth-config\") pod \"51ec296f-351d-4806-aaa5-b062c09b93fa\" (UID: \"51ec296f-351d-4806-aaa5-b062c09b93fa\") " Apr 21 15:39:10.214739 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:10.214703 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51ec296f-351d-4806-aaa5-b062c09b93fa-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "51ec296f-351d-4806-aaa5-b062c09b93fa" (UID: "51ec296f-351d-4806-aaa5-b062c09b93fa"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:39:10.215018 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:10.214712 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51ec296f-351d-4806-aaa5-b062c09b93fa-console-config" (OuterVolumeSpecName: "console-config") pod "51ec296f-351d-4806-aaa5-b062c09b93fa" (UID: "51ec296f-351d-4806-aaa5-b062c09b93fa"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:39:10.215130 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:10.215024 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51ec296f-351d-4806-aaa5-b062c09b93fa-service-ca" (OuterVolumeSpecName: "service-ca") pod "51ec296f-351d-4806-aaa5-b062c09b93fa" (UID: "51ec296f-351d-4806-aaa5-b062c09b93fa"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:39:10.216868 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:10.216837 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ec296f-351d-4806-aaa5-b062c09b93fa-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "51ec296f-351d-4806-aaa5-b062c09b93fa" (UID: "51ec296f-351d-4806-aaa5-b062c09b93fa"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:39:10.216947 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:10.216884 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51ec296f-351d-4806-aaa5-b062c09b93fa-kube-api-access-prvtt" (OuterVolumeSpecName: "kube-api-access-prvtt") pod "51ec296f-351d-4806-aaa5-b062c09b93fa" (UID: "51ec296f-351d-4806-aaa5-b062c09b93fa"). InnerVolumeSpecName "kube-api-access-prvtt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:39:10.216947 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:10.216911 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ec296f-351d-4806-aaa5-b062c09b93fa-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "51ec296f-351d-4806-aaa5-b062c09b93fa" (UID: "51ec296f-351d-4806-aaa5-b062c09b93fa"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:39:10.315666 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:10.315632 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/51ec296f-351d-4806-aaa5-b062c09b93fa-oauth-serving-cert\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:39:10.315666 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:10.315663 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51ec296f-351d-4806-aaa5-b062c09b93fa-service-ca\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:39:10.315666 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:10.315673 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-prvtt\" (UniqueName: \"kubernetes.io/projected/51ec296f-351d-4806-aaa5-b062c09b93fa-kube-api-access-prvtt\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:39:10.315892 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:10.315682 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/51ec296f-351d-4806-aaa5-b062c09b93fa-console-serving-cert\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:39:10.315892 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:10.315691 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/51ec296f-351d-4806-aaa5-b062c09b93fa-console-oauth-config\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:39:10.315892 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:10.315700 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/51ec296f-351d-4806-aaa5-b062c09b93fa-console-config\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:39:10.828877 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:10.828850 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c6944f596-9v9wx_51ec296f-351d-4806-aaa5-b062c09b93fa/console/0.log" Apr 21 15:39:10.829110 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:10.828888 2569 generic.go:358] "Generic (PLEG): container finished" podID="51ec296f-351d-4806-aaa5-b062c09b93fa" containerID="f968e276d66aaa6589877b61c0c16b680d6a3f7e736cd6ad20f606ab661e5524" exitCode=2 Apr 21 15:39:10.829110 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:10.828963 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c6944f596-9v9wx" event={"ID":"51ec296f-351d-4806-aaa5-b062c09b93fa","Type":"ContainerDied","Data":"f968e276d66aaa6589877b61c0c16b680d6a3f7e736cd6ad20f606ab661e5524"} Apr 21 15:39:10.829110 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:10.828989 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c6944f596-9v9wx" event={"ID":"51ec296f-351d-4806-aaa5-b062c09b93fa","Type":"ContainerDied","Data":"2daa4c999d3030718ded30cbcb4c9c1d3e83ea100efde3b0516bd8ec99fa3419"} Apr 21 15:39:10.829110 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:10.828986 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c6944f596-9v9wx" Apr 21 15:39:10.829110 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:10.829002 2569 scope.go:117] "RemoveContainer" containerID="f968e276d66aaa6589877b61c0c16b680d6a3f7e736cd6ad20f606ab661e5524" Apr 21 15:39:10.837246 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:10.837205 2569 scope.go:117] "RemoveContainer" containerID="f968e276d66aaa6589877b61c0c16b680d6a3f7e736cd6ad20f606ab661e5524" Apr 21 15:39:10.837541 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:39:10.837510 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f968e276d66aaa6589877b61c0c16b680d6a3f7e736cd6ad20f606ab661e5524\": container with ID starting with f968e276d66aaa6589877b61c0c16b680d6a3f7e736cd6ad20f606ab661e5524 not found: ID does not exist" containerID="f968e276d66aaa6589877b61c0c16b680d6a3f7e736cd6ad20f606ab661e5524" Apr 21 15:39:10.837632 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:10.837547 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f968e276d66aaa6589877b61c0c16b680d6a3f7e736cd6ad20f606ab661e5524"} err="failed to get container status \"f968e276d66aaa6589877b61c0c16b680d6a3f7e736cd6ad20f606ab661e5524\": rpc error: code = NotFound desc = could not find container \"f968e276d66aaa6589877b61c0c16b680d6a3f7e736cd6ad20f606ab661e5524\": container with ID starting with f968e276d66aaa6589877b61c0c16b680d6a3f7e736cd6ad20f606ab661e5524 not found: ID does not exist" Apr 21 15:39:10.850993 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:10.850963 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c6944f596-9v9wx"] Apr 21 15:39:10.856571 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:10.856548 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6c6944f596-9v9wx"] Apr 21 15:39:11.054459 ip-10-0-128-232 kubenswrapper[2569]: 
I0421 15:39:11.054428 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51ec296f-351d-4806-aaa5-b062c09b93fa" path="/var/lib/kubelet/pods/51ec296f-351d-4806-aaa5-b062c09b93fa/volumes" Apr 21 15:39:33.320197 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:33.320163 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5cc4d6dc79-dn4p2"] Apr 21 15:39:33.320603 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:33.320457 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51ec296f-351d-4806-aaa5-b062c09b93fa" containerName="console" Apr 21 15:39:33.320603 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:33.320469 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ec296f-351d-4806-aaa5-b062c09b93fa" containerName="console" Apr 21 15:39:33.320603 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:33.320548 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="51ec296f-351d-4806-aaa5-b062c09b93fa" containerName="console" Apr 21 15:39:33.324904 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:33.324883 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5cc4d6dc79-dn4p2" Apr 21 15:39:33.336752 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:33.336726 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5cc4d6dc79-dn4p2"] Apr 21 15:39:33.406568 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:33.406536 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-console-serving-cert\") pod \"console-5cc4d6dc79-dn4p2\" (UID: \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\") " pod="openshift-console/console-5cc4d6dc79-dn4p2" Apr 21 15:39:33.406568 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:33.406567 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-service-ca\") pod \"console-5cc4d6dc79-dn4p2\" (UID: \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\") " pod="openshift-console/console-5cc4d6dc79-dn4p2" Apr 21 15:39:33.406779 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:33.406589 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-oauth-serving-cert\") pod \"console-5cc4d6dc79-dn4p2\" (UID: \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\") " pod="openshift-console/console-5cc4d6dc79-dn4p2" Apr 21 15:39:33.406779 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:33.406660 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-console-oauth-config\") pod \"console-5cc4d6dc79-dn4p2\" (UID: \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\") " pod="openshift-console/console-5cc4d6dc79-dn4p2" Apr 21 
15:39:33.406779 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:33.406724 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-console-config\") pod \"console-5cc4d6dc79-dn4p2\" (UID: \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\") " pod="openshift-console/console-5cc4d6dc79-dn4p2" Apr 21 15:39:33.406779 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:33.406752 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sx7x\" (UniqueName: \"kubernetes.io/projected/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-kube-api-access-2sx7x\") pod \"console-5cc4d6dc79-dn4p2\" (UID: \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\") " pod="openshift-console/console-5cc4d6dc79-dn4p2" Apr 21 15:39:33.406926 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:33.406802 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-trusted-ca-bundle\") pod \"console-5cc4d6dc79-dn4p2\" (UID: \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\") " pod="openshift-console/console-5cc4d6dc79-dn4p2" Apr 21 15:39:33.507981 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:33.507950 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-console-serving-cert\") pod \"console-5cc4d6dc79-dn4p2\" (UID: \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\") " pod="openshift-console/console-5cc4d6dc79-dn4p2" Apr 21 15:39:33.507981 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:33.507983 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-service-ca\") pod 
\"console-5cc4d6dc79-dn4p2\" (UID: \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\") " pod="openshift-console/console-5cc4d6dc79-dn4p2" Apr 21 15:39:33.508289 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:33.508009 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-oauth-serving-cert\") pod \"console-5cc4d6dc79-dn4p2\" (UID: \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\") " pod="openshift-console/console-5cc4d6dc79-dn4p2" Apr 21 15:39:33.508289 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:33.508037 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-console-oauth-config\") pod \"console-5cc4d6dc79-dn4p2\" (UID: \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\") " pod="openshift-console/console-5cc4d6dc79-dn4p2" Apr 21 15:39:33.508289 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:33.508094 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-console-config\") pod \"console-5cc4d6dc79-dn4p2\" (UID: \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\") " pod="openshift-console/console-5cc4d6dc79-dn4p2" Apr 21 15:39:33.508289 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:33.508134 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2sx7x\" (UniqueName: \"kubernetes.io/projected/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-kube-api-access-2sx7x\") pod \"console-5cc4d6dc79-dn4p2\" (UID: \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\") " pod="openshift-console/console-5cc4d6dc79-dn4p2" Apr 21 15:39:33.508289 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:33.508196 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-trusted-ca-bundle\") pod \"console-5cc4d6dc79-dn4p2\" (UID: \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\") " pod="openshift-console/console-5cc4d6dc79-dn4p2" Apr 21 15:39:33.508804 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:33.508779 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-oauth-serving-cert\") pod \"console-5cc4d6dc79-dn4p2\" (UID: \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\") " pod="openshift-console/console-5cc4d6dc79-dn4p2" Apr 21 15:39:33.508932 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:33.508841 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-service-ca\") pod \"console-5cc4d6dc79-dn4p2\" (UID: \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\") " pod="openshift-console/console-5cc4d6dc79-dn4p2" Apr 21 15:39:33.508932 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:33.508917 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-console-config\") pod \"console-5cc4d6dc79-dn4p2\" (UID: \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\") " pod="openshift-console/console-5cc4d6dc79-dn4p2" Apr 21 15:39:33.509176 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:33.509154 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-trusted-ca-bundle\") pod \"console-5cc4d6dc79-dn4p2\" (UID: \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\") " pod="openshift-console/console-5cc4d6dc79-dn4p2" Apr 21 15:39:33.510567 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:33.510543 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-console-serving-cert\") pod \"console-5cc4d6dc79-dn4p2\" (UID: \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\") " pod="openshift-console/console-5cc4d6dc79-dn4p2" Apr 21 15:39:33.510567 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:33.510557 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-console-oauth-config\") pod \"console-5cc4d6dc79-dn4p2\" (UID: \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\") " pod="openshift-console/console-5cc4d6dc79-dn4p2" Apr 21 15:39:33.517169 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:33.517151 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sx7x\" (UniqueName: \"kubernetes.io/projected/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-kube-api-access-2sx7x\") pod \"console-5cc4d6dc79-dn4p2\" (UID: \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\") " pod="openshift-console/console-5cc4d6dc79-dn4p2" Apr 21 15:39:33.635283 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:33.635192 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5cc4d6dc79-dn4p2" Apr 21 15:39:33.770529 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:33.770497 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5cc4d6dc79-dn4p2"] Apr 21 15:39:33.773682 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:39:33.773652 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b42dfae_38bf_4d2f_a6f5_229d887c6b4a.slice/crio-fbb67373631ba48bad254a23fa1eb18563df87d9e5f9bc6a41ffb3145c531634 WatchSource:0}: Error finding container fbb67373631ba48bad254a23fa1eb18563df87d9e5f9bc6a41ffb3145c531634: Status 404 returned error can't find the container with id fbb67373631ba48bad254a23fa1eb18563df87d9e5f9bc6a41ffb3145c531634 Apr 21 15:39:33.899060 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:33.898979 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cc4d6dc79-dn4p2" event={"ID":"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a","Type":"ContainerStarted","Data":"a013a5f52c4cdf97ec2bbccfce67fa882254340a95147577c820139f50f56138"} Apr 21 15:39:33.899060 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:33.899014 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cc4d6dc79-dn4p2" event={"ID":"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a","Type":"ContainerStarted","Data":"fbb67373631ba48bad254a23fa1eb18563df87d9e5f9bc6a41ffb3145c531634"} Apr 21 15:39:36.839755 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:36.839710 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/893ee07d-ac5e-4593-93fd-80655b690072-metrics-certs\") pod \"network-metrics-daemon-lp64c\" (UID: \"893ee07d-ac5e-4593-93fd-80655b690072\") " pod="openshift-multus/network-metrics-daemon-lp64c" Apr 21 15:39:36.842090 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:36.842069 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/893ee07d-ac5e-4593-93fd-80655b690072-metrics-certs\") pod \"network-metrics-daemon-lp64c\" (UID: \"893ee07d-ac5e-4593-93fd-80655b690072\") " pod="openshift-multus/network-metrics-daemon-lp64c" Apr 21 15:39:36.954827 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:36.954793 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-sjbkh\"" Apr 21 15:39:36.962305 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:36.962286 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lp64c" Apr 21 15:39:37.087816 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:37.087772 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5cc4d6dc79-dn4p2" podStartSLOduration=4.08775617 podStartE2EDuration="4.08775617s" podCreationTimestamp="2026-04-21 15:39:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:39:33.952571565 +0000 UTC m=+249.524604507" watchObservedRunningTime="2026-04-21 15:39:37.08775617 +0000 UTC m=+252.659789151" Apr 21 15:39:37.088727 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:37.088709 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lp64c"] Apr 21 15:39:37.091526 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:39:37.091438 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod893ee07d_ac5e_4593_93fd_80655b690072.slice/crio-e575014fb9d2ac92abdd3676452c1ab0c92db9b45f977d9cb0579dc77cb90552 WatchSource:0}: Error finding container e575014fb9d2ac92abdd3676452c1ab0c92db9b45f977d9cb0579dc77cb90552: Status 404 returned error can't find the 
container with id e575014fb9d2ac92abdd3676452c1ab0c92db9b45f977d9cb0579dc77cb90552 Apr 21 15:39:37.911286 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:37.911245 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lp64c" event={"ID":"893ee07d-ac5e-4593-93fd-80655b690072","Type":"ContainerStarted","Data":"e575014fb9d2ac92abdd3676452c1ab0c92db9b45f977d9cb0579dc77cb90552"} Apr 21 15:39:38.915430 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:38.915394 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lp64c" event={"ID":"893ee07d-ac5e-4593-93fd-80655b690072","Type":"ContainerStarted","Data":"e495832bdc63bea8260d171396a8a602dc8265d89856092a683a1a03cafc8e4f"} Apr 21 15:39:38.915430 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:38.915429 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lp64c" event={"ID":"893ee07d-ac5e-4593-93fd-80655b690072","Type":"ContainerStarted","Data":"251d14a57cc61573ff84e0e5c9128a74cf9975fb19551554bc16e0771c149e21"} Apr 21 15:39:38.936677 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:38.936624 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-lp64c" podStartSLOduration=253.013233672 podStartE2EDuration="4m13.93660877s" podCreationTimestamp="2026-04-21 15:35:25 +0000 UTC" firstStartedPulling="2026-04-21 15:39:37.093368342 +0000 UTC m=+252.665401267" lastFinishedPulling="2026-04-21 15:39:38.016743444 +0000 UTC m=+253.588776365" observedRunningTime="2026-04-21 15:39:38.934590614 +0000 UTC m=+254.506623569" watchObservedRunningTime="2026-04-21 15:39:38.93660877 +0000 UTC m=+254.508641744" Apr 21 15:39:43.635460 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:43.635364 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5cc4d6dc79-dn4p2" Apr 21 15:39:43.635460 
ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:43.635441 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5cc4d6dc79-dn4p2" Apr 21 15:39:43.639999 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:43.639973 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5cc4d6dc79-dn4p2" Apr 21 15:39:43.935021 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:43.934996 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5cc4d6dc79-dn4p2" Apr 21 15:39:44.003462 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:39:44.003428 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-86c78684d4-llkv7"] Apr 21 15:40:09.023452 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:09.023390 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-86c78684d4-llkv7" podUID="c3b8617f-dec2-4795-85bb-f49984b37c08" containerName="console" containerID="cri-o://2e4dd04b414ac6d18a1c40bd0d8533c1e44b7272b8d0af0f440b35442465525b" gracePeriod=15 Apr 21 15:40:09.271567 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:09.271546 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86c78684d4-llkv7_c3b8617f-dec2-4795-85bb-f49984b37c08/console/0.log" Apr 21 15:40:09.271683 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:09.271606 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-86c78684d4-llkv7" Apr 21 15:40:09.406287 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:09.406200 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c3b8617f-dec2-4795-85bb-f49984b37c08-service-ca\") pod \"c3b8617f-dec2-4795-85bb-f49984b37c08\" (UID: \"c3b8617f-dec2-4795-85bb-f49984b37c08\") " Apr 21 15:40:09.406287 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:09.406254 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c3b8617f-dec2-4795-85bb-f49984b37c08-console-config\") pod \"c3b8617f-dec2-4795-85bb-f49984b37c08\" (UID: \"c3b8617f-dec2-4795-85bb-f49984b37c08\") " Apr 21 15:40:09.406535 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:09.406299 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3b8617f-dec2-4795-85bb-f49984b37c08-trusted-ca-bundle\") pod \"c3b8617f-dec2-4795-85bb-f49984b37c08\" (UID: \"c3b8617f-dec2-4795-85bb-f49984b37c08\") " Apr 21 15:40:09.406535 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:09.406320 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c3b8617f-dec2-4795-85bb-f49984b37c08-oauth-serving-cert\") pod \"c3b8617f-dec2-4795-85bb-f49984b37c08\" (UID: \"c3b8617f-dec2-4795-85bb-f49984b37c08\") " Apr 21 15:40:09.406535 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:09.406348 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c3b8617f-dec2-4795-85bb-f49984b37c08-console-oauth-config\") pod \"c3b8617f-dec2-4795-85bb-f49984b37c08\" (UID: \"c3b8617f-dec2-4795-85bb-f49984b37c08\") " Apr 21 15:40:09.406535 ip-10-0-128-232 
kubenswrapper[2569]: I0421 15:40:09.406431 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c3b8617f-dec2-4795-85bb-f49984b37c08-console-serving-cert\") pod \"c3b8617f-dec2-4795-85bb-f49984b37c08\" (UID: \"c3b8617f-dec2-4795-85bb-f49984b37c08\") " Apr 21 15:40:09.406535 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:09.406499 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvvrd\" (UniqueName: \"kubernetes.io/projected/c3b8617f-dec2-4795-85bb-f49984b37c08-kube-api-access-jvvrd\") pod \"c3b8617f-dec2-4795-85bb-f49984b37c08\" (UID: \"c3b8617f-dec2-4795-85bb-f49984b37c08\") " Apr 21 15:40:09.406786 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:09.406761 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3b8617f-dec2-4795-85bb-f49984b37c08-console-config" (OuterVolumeSpecName: "console-config") pod "c3b8617f-dec2-4795-85bb-f49984b37c08" (UID: "c3b8617f-dec2-4795-85bb-f49984b37c08"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:40:09.406838 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:09.406782 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3b8617f-dec2-4795-85bb-f49984b37c08-service-ca" (OuterVolumeSpecName: "service-ca") pod "c3b8617f-dec2-4795-85bb-f49984b37c08" (UID: "c3b8617f-dec2-4795-85bb-f49984b37c08"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:40:09.406838 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:09.406791 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3b8617f-dec2-4795-85bb-f49984b37c08-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c3b8617f-dec2-4795-85bb-f49984b37c08" (UID: "c3b8617f-dec2-4795-85bb-f49984b37c08"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:40:09.406903 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:09.406835 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3b8617f-dec2-4795-85bb-f49984b37c08-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c3b8617f-dec2-4795-85bb-f49984b37c08" (UID: "c3b8617f-dec2-4795-85bb-f49984b37c08"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:40:09.408592 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:09.408569 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3b8617f-dec2-4795-85bb-f49984b37c08-kube-api-access-jvvrd" (OuterVolumeSpecName: "kube-api-access-jvvrd") pod "c3b8617f-dec2-4795-85bb-f49984b37c08" (UID: "c3b8617f-dec2-4795-85bb-f49984b37c08"). InnerVolumeSpecName "kube-api-access-jvvrd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:40:09.408694 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:09.408676 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b8617f-dec2-4795-85bb-f49984b37c08-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c3b8617f-dec2-4795-85bb-f49984b37c08" (UID: "c3b8617f-dec2-4795-85bb-f49984b37c08"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:40:09.408778 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:09.408761 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b8617f-dec2-4795-85bb-f49984b37c08-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c3b8617f-dec2-4795-85bb-f49984b37c08" (UID: "c3b8617f-dec2-4795-85bb-f49984b37c08"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:40:09.507505 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:09.507442 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3b8617f-dec2-4795-85bb-f49984b37c08-trusted-ca-bundle\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:40:09.507505 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:09.507495 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c3b8617f-dec2-4795-85bb-f49984b37c08-oauth-serving-cert\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:40:09.507505 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:09.507506 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c3b8617f-dec2-4795-85bb-f49984b37c08-console-oauth-config\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:40:09.507505 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:09.507515 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c3b8617f-dec2-4795-85bb-f49984b37c08-console-serving-cert\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:40:09.507757 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:09.507524 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jvvrd\" (UniqueName: 
\"kubernetes.io/projected/c3b8617f-dec2-4795-85bb-f49984b37c08-kube-api-access-jvvrd\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:40:09.507757 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:09.507534 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c3b8617f-dec2-4795-85bb-f49984b37c08-service-ca\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:40:09.507757 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:09.507543 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c3b8617f-dec2-4795-85bb-f49984b37c08-console-config\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:40:10.006656 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:10.006630 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86c78684d4-llkv7_c3b8617f-dec2-4795-85bb-f49984b37c08/console/0.log" Apr 21 15:40:10.006824 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:10.006667 2569 generic.go:358] "Generic (PLEG): container finished" podID="c3b8617f-dec2-4795-85bb-f49984b37c08" containerID="2e4dd04b414ac6d18a1c40bd0d8533c1e44b7272b8d0af0f440b35442465525b" exitCode=2 Apr 21 15:40:10.006824 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:10.006737 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-86c78684d4-llkv7" Apr 21 15:40:10.006824 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:10.006753 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86c78684d4-llkv7" event={"ID":"c3b8617f-dec2-4795-85bb-f49984b37c08","Type":"ContainerDied","Data":"2e4dd04b414ac6d18a1c40bd0d8533c1e44b7272b8d0af0f440b35442465525b"} Apr 21 15:40:10.006824 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:10.006780 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86c78684d4-llkv7" event={"ID":"c3b8617f-dec2-4795-85bb-f49984b37c08","Type":"ContainerDied","Data":"0c29ecf315cc3cfecb19135667a0b8ae9212dbd76737f5e2fc216172ed6dfd9a"} Apr 21 15:40:10.006824 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:10.006796 2569 scope.go:117] "RemoveContainer" containerID="2e4dd04b414ac6d18a1c40bd0d8533c1e44b7272b8d0af0f440b35442465525b" Apr 21 15:40:10.015374 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:10.015354 2569 scope.go:117] "RemoveContainer" containerID="2e4dd04b414ac6d18a1c40bd0d8533c1e44b7272b8d0af0f440b35442465525b" Apr 21 15:40:10.015651 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:40:10.015631 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e4dd04b414ac6d18a1c40bd0d8533c1e44b7272b8d0af0f440b35442465525b\": container with ID starting with 2e4dd04b414ac6d18a1c40bd0d8533c1e44b7272b8d0af0f440b35442465525b not found: ID does not exist" containerID="2e4dd04b414ac6d18a1c40bd0d8533c1e44b7272b8d0af0f440b35442465525b" Apr 21 15:40:10.015706 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:10.015662 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e4dd04b414ac6d18a1c40bd0d8533c1e44b7272b8d0af0f440b35442465525b"} err="failed to get container status \"2e4dd04b414ac6d18a1c40bd0d8533c1e44b7272b8d0af0f440b35442465525b\": rpc error: code = 
NotFound desc = could not find container \"2e4dd04b414ac6d18a1c40bd0d8533c1e44b7272b8d0af0f440b35442465525b\": container with ID starting with 2e4dd04b414ac6d18a1c40bd0d8533c1e44b7272b8d0af0f440b35442465525b not found: ID does not exist" Apr 21 15:40:10.032285 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:10.032249 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-86c78684d4-llkv7"] Apr 21 15:40:10.036370 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:10.036345 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-86c78684d4-llkv7"] Apr 21 15:40:11.053469 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:11.053427 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3b8617f-dec2-4795-85bb-f49984b37c08" path="/var/lib/kubelet/pods/c3b8617f-dec2-4795-85bb-f49984b37c08/volumes" Apr 21 15:40:24.930084 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:24.930051 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wkxqg_7b0083d8-b152-40fa-9a89-e3180ed1747d/ovn-acl-logging/0.log" Apr 21 15:40:24.930782 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:24.930530 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wkxqg_7b0083d8-b152-40fa-9a89-e3180ed1747d/ovn-acl-logging/0.log" Apr 21 15:40:24.936805 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:24.936788 2569 kubelet.go:1628] "Image garbage collection succeeded" Apr 21 15:40:42.254962 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:42.254924 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-8575f75ff6-hjp2t"] Apr 21 15:40:42.257492 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:42.255361 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3b8617f-dec2-4795-85bb-f49984b37c08" containerName="console" Apr 21 15:40:42.257492 ip-10-0-128-232 kubenswrapper[2569]: 
I0421 15:40:42.255377 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3b8617f-dec2-4795-85bb-f49984b37c08" containerName="console" Apr 21 15:40:42.257492 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:42.255428 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3b8617f-dec2-4795-85bb-f49984b37c08" containerName="console" Apr 21 15:40:42.258303 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:42.258275 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8575f75ff6-hjp2t" Apr 21 15:40:42.271565 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:42.271534 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8575f75ff6-hjp2t"] Apr 21 15:40:42.354539 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:42.354500 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-console-config\") pod \"console-8575f75ff6-hjp2t\" (UID: \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\") " pod="openshift-console/console-8575f75ff6-hjp2t" Apr 21 15:40:42.354713 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:42.354548 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-trusted-ca-bundle\") pod \"console-8575f75ff6-hjp2t\" (UID: \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\") " pod="openshift-console/console-8575f75ff6-hjp2t" Apr 21 15:40:42.354713 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:42.354575 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-console-serving-cert\") pod \"console-8575f75ff6-hjp2t\" (UID: \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\") " 
pod="openshift-console/console-8575f75ff6-hjp2t" Apr 21 15:40:42.354713 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:42.354597 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-console-oauth-config\") pod \"console-8575f75ff6-hjp2t\" (UID: \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\") " pod="openshift-console/console-8575f75ff6-hjp2t" Apr 21 15:40:42.354713 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:42.354612 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-oauth-serving-cert\") pod \"console-8575f75ff6-hjp2t\" (UID: \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\") " pod="openshift-console/console-8575f75ff6-hjp2t" Apr 21 15:40:42.354713 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:42.354634 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-service-ca\") pod \"console-8575f75ff6-hjp2t\" (UID: \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\") " pod="openshift-console/console-8575f75ff6-hjp2t" Apr 21 15:40:42.354884 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:42.354725 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvt5q\" (UniqueName: \"kubernetes.io/projected/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-kube-api-access-pvt5q\") pod \"console-8575f75ff6-hjp2t\" (UID: \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\") " pod="openshift-console/console-8575f75ff6-hjp2t" Apr 21 15:40:42.456112 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:42.456081 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-console-oauth-config\") pod \"console-8575f75ff6-hjp2t\" (UID: \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\") " pod="openshift-console/console-8575f75ff6-hjp2t" Apr 21 15:40:42.456112 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:42.456116 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-oauth-serving-cert\") pod \"console-8575f75ff6-hjp2t\" (UID: \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\") " pod="openshift-console/console-8575f75ff6-hjp2t" Apr 21 15:40:42.456328 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:42.456137 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-service-ca\") pod \"console-8575f75ff6-hjp2t\" (UID: \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\") " pod="openshift-console/console-8575f75ff6-hjp2t" Apr 21 15:40:42.456328 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:42.456187 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pvt5q\" (UniqueName: \"kubernetes.io/projected/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-kube-api-access-pvt5q\") pod \"console-8575f75ff6-hjp2t\" (UID: \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\") " pod="openshift-console/console-8575f75ff6-hjp2t" Apr 21 15:40:42.456328 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:42.456205 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-console-config\") pod \"console-8575f75ff6-hjp2t\" (UID: \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\") " pod="openshift-console/console-8575f75ff6-hjp2t" Apr 21 15:40:42.456328 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:42.456223 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-trusted-ca-bundle\") pod \"console-8575f75ff6-hjp2t\" (UID: \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\") " pod="openshift-console/console-8575f75ff6-hjp2t" Apr 21 15:40:42.456328 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:42.456248 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-console-serving-cert\") pod \"console-8575f75ff6-hjp2t\" (UID: \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\") " pod="openshift-console/console-8575f75ff6-hjp2t" Apr 21 15:40:42.456976 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:42.456914 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-oauth-serving-cert\") pod \"console-8575f75ff6-hjp2t\" (UID: \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\") " pod="openshift-console/console-8575f75ff6-hjp2t" Apr 21 15:40:42.456976 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:42.456964 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-service-ca\") pod \"console-8575f75ff6-hjp2t\" (UID: \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\") " pod="openshift-console/console-8575f75ff6-hjp2t" Apr 21 15:40:42.457148 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:42.457086 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-trusted-ca-bundle\") pod \"console-8575f75ff6-hjp2t\" (UID: \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\") " pod="openshift-console/console-8575f75ff6-hjp2t" Apr 21 15:40:42.457184 ip-10-0-128-232 kubenswrapper[2569]: I0421 
15:40:42.457146 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-console-config\") pod \"console-8575f75ff6-hjp2t\" (UID: \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\") " pod="openshift-console/console-8575f75ff6-hjp2t" Apr 21 15:40:42.458743 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:42.458725 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-console-oauth-config\") pod \"console-8575f75ff6-hjp2t\" (UID: \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\") " pod="openshift-console/console-8575f75ff6-hjp2t" Apr 21 15:40:42.458837 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:42.458820 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-console-serving-cert\") pod \"console-8575f75ff6-hjp2t\" (UID: \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\") " pod="openshift-console/console-8575f75ff6-hjp2t" Apr 21 15:40:42.467485 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:42.467457 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvt5q\" (UniqueName: \"kubernetes.io/projected/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-kube-api-access-pvt5q\") pod \"console-8575f75ff6-hjp2t\" (UID: \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\") " pod="openshift-console/console-8575f75ff6-hjp2t" Apr 21 15:40:42.568122 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:42.568015 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8575f75ff6-hjp2t" Apr 21 15:40:42.707359 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:42.707333 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8575f75ff6-hjp2t"] Apr 21 15:40:42.707493 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:40:42.707454 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b53dc26_94b9_4fe8_9ac6_40239830cc3c.slice/crio-25a85d774b329c1a8808ce3b7ad564ee1e3981f9f74f59bd08b9602546a973f3 WatchSource:0}: Error finding container 25a85d774b329c1a8808ce3b7ad564ee1e3981f9f74f59bd08b9602546a973f3: Status 404 returned error can't find the container with id 25a85d774b329c1a8808ce3b7ad564ee1e3981f9f74f59bd08b9602546a973f3 Apr 21 15:40:42.709307 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:42.709292 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:40:43.102929 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:43.102894 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8575f75ff6-hjp2t" event={"ID":"0b53dc26-94b9-4fe8-9ac6-40239830cc3c","Type":"ContainerStarted","Data":"a88f7b2fedec20fc2107982e83a28af5e2ef05bc3a560cba3747318cf96ad20f"} Apr 21 15:40:43.102929 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:43.102928 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8575f75ff6-hjp2t" event={"ID":"0b53dc26-94b9-4fe8-9ac6-40239830cc3c","Type":"ContainerStarted","Data":"25a85d774b329c1a8808ce3b7ad564ee1e3981f9f74f59bd08b9602546a973f3"} Apr 21 15:40:43.133677 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:43.133630 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8575f75ff6-hjp2t" podStartSLOduration=1.133617424 podStartE2EDuration="1.133617424s" podCreationTimestamp="2026-04-21 15:40:42 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:40:43.133234541 +0000 UTC m=+318.705267484" watchObservedRunningTime="2026-04-21 15:40:43.133617424 +0000 UTC m=+318.705650367" Apr 21 15:40:52.568832 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:52.568792 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-8575f75ff6-hjp2t" Apr 21 15:40:52.568832 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:52.568842 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-8575f75ff6-hjp2t" Apr 21 15:40:52.573719 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:52.573690 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-8575f75ff6-hjp2t" Apr 21 15:40:53.133958 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:53.133922 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-8575f75ff6-hjp2t" Apr 21 15:40:53.193148 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:40:53.193115 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5cc4d6dc79-dn4p2"] Apr 21 15:41:18.213403 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:18.213351 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5cc4d6dc79-dn4p2" podUID="7b42dfae-38bf-4d2f-a6f5-229d887c6b4a" containerName="console" containerID="cri-o://a013a5f52c4cdf97ec2bbccfce67fa882254340a95147577c820139f50f56138" gracePeriod=15 Apr 21 15:41:18.445581 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:18.445558 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5cc4d6dc79-dn4p2_7b42dfae-38bf-4d2f-a6f5-229d887c6b4a/console/0.log" Apr 21 15:41:18.445715 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:18.445621 2569 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5cc4d6dc79-dn4p2" Apr 21 15:41:18.550308 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:18.550221 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-console-oauth-config\") pod \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\" (UID: \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\") " Apr 21 15:41:18.550451 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:18.550333 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-console-config\") pod \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\" (UID: \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\") " Apr 21 15:41:18.550451 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:18.550379 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-console-serving-cert\") pod \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\" (UID: \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\") " Apr 21 15:41:18.550451 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:18.550404 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-oauth-serving-cert\") pod \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\" (UID: \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\") " Apr 21 15:41:18.550451 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:18.550428 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sx7x\" (UniqueName: \"kubernetes.io/projected/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-kube-api-access-2sx7x\") pod \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\" (UID: 
\"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\") " Apr 21 15:41:18.550629 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:18.550534 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-service-ca\") pod \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\" (UID: \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\") " Apr 21 15:41:18.550681 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:18.550659 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-trusted-ca-bundle\") pod \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\" (UID: \"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a\") " Apr 21 15:41:18.550851 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:18.550817 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7b42dfae-38bf-4d2f-a6f5-229d887c6b4a" (UID: "7b42dfae-38bf-4d2f-a6f5-229d887c6b4a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:41:18.550851 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:18.550838 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-console-config" (OuterVolumeSpecName: "console-config") pod "7b42dfae-38bf-4d2f-a6f5-229d887c6b4a" (UID: "7b42dfae-38bf-4d2f-a6f5-229d887c6b4a"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:41:18.551012 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:18.550918 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-service-ca" (OuterVolumeSpecName: "service-ca") pod "7b42dfae-38bf-4d2f-a6f5-229d887c6b4a" (UID: "7b42dfae-38bf-4d2f-a6f5-229d887c6b4a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:41:18.551012 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:18.550957 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-console-config\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:41:18.551012 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:18.550972 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-oauth-serving-cert\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:41:18.551214 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:18.551188 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7b42dfae-38bf-4d2f-a6f5-229d887c6b4a" (UID: "7b42dfae-38bf-4d2f-a6f5-229d887c6b4a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:41:18.552510 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:18.552465 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7b42dfae-38bf-4d2f-a6f5-229d887c6b4a" (UID: "7b42dfae-38bf-4d2f-a6f5-229d887c6b4a"). 
InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:41:18.552730 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:18.552707 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-kube-api-access-2sx7x" (OuterVolumeSpecName: "kube-api-access-2sx7x") pod "7b42dfae-38bf-4d2f-a6f5-229d887c6b4a" (UID: "7b42dfae-38bf-4d2f-a6f5-229d887c6b4a"). InnerVolumeSpecName "kube-api-access-2sx7x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:41:18.552780 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:18.552764 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7b42dfae-38bf-4d2f-a6f5-229d887c6b4a" (UID: "7b42dfae-38bf-4d2f-a6f5-229d887c6b4a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:41:18.652398 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:18.652359 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-console-oauth-config\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:41:18.652398 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:18.652388 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-console-serving-cert\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:41:18.652398 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:18.652399 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2sx7x\" (UniqueName: \"kubernetes.io/projected/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-kube-api-access-2sx7x\") on node 
\"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:41:18.652398 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:18.652409 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-service-ca\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:41:18.652672 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:18.652418 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a-trusted-ca-bundle\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:41:19.204396 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:19.204368 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5cc4d6dc79-dn4p2_7b42dfae-38bf-4d2f-a6f5-229d887c6b4a/console/0.log" Apr 21 15:41:19.204570 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:19.204412 2569 generic.go:358] "Generic (PLEG): container finished" podID="7b42dfae-38bf-4d2f-a6f5-229d887c6b4a" containerID="a013a5f52c4cdf97ec2bbccfce67fa882254340a95147577c820139f50f56138" exitCode=2 Apr 21 15:41:19.204570 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:19.204506 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5cc4d6dc79-dn4p2" Apr 21 15:41:19.204570 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:19.204507 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cc4d6dc79-dn4p2" event={"ID":"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a","Type":"ContainerDied","Data":"a013a5f52c4cdf97ec2bbccfce67fa882254340a95147577c820139f50f56138"} Apr 21 15:41:19.204570 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:19.204547 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cc4d6dc79-dn4p2" event={"ID":"7b42dfae-38bf-4d2f-a6f5-229d887c6b4a","Type":"ContainerDied","Data":"fbb67373631ba48bad254a23fa1eb18563df87d9e5f9bc6a41ffb3145c531634"} Apr 21 15:41:19.204570 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:19.204562 2569 scope.go:117] "RemoveContainer" containerID="a013a5f52c4cdf97ec2bbccfce67fa882254340a95147577c820139f50f56138" Apr 21 15:41:19.212391 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:19.212359 2569 scope.go:117] "RemoveContainer" containerID="a013a5f52c4cdf97ec2bbccfce67fa882254340a95147577c820139f50f56138" Apr 21 15:41:19.212685 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:41:19.212663 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a013a5f52c4cdf97ec2bbccfce67fa882254340a95147577c820139f50f56138\": container with ID starting with a013a5f52c4cdf97ec2bbccfce67fa882254340a95147577c820139f50f56138 not found: ID does not exist" containerID="a013a5f52c4cdf97ec2bbccfce67fa882254340a95147577c820139f50f56138" Apr 21 15:41:19.212752 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:19.212699 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a013a5f52c4cdf97ec2bbccfce67fa882254340a95147577c820139f50f56138"} err="failed to get container status \"a013a5f52c4cdf97ec2bbccfce67fa882254340a95147577c820139f50f56138\": rpc error: code = 
NotFound desc = could not find container \"a013a5f52c4cdf97ec2bbccfce67fa882254340a95147577c820139f50f56138\": container with ID starting with a013a5f52c4cdf97ec2bbccfce67fa882254340a95147577c820139f50f56138 not found: ID does not exist" Apr 21 15:41:19.237266 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:19.237233 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5cc4d6dc79-dn4p2"] Apr 21 15:41:19.247887 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:19.247861 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5cc4d6dc79-dn4p2"] Apr 21 15:41:21.053630 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:41:21.053594 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b42dfae-38bf-4d2f-a6f5-229d887c6b4a" path="/var/lib/kubelet/pods/7b42dfae-38bf-4d2f-a6f5-229d887c6b4a/volumes" Apr 21 15:42:03.830374 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:03.830340 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2w58f"] Apr 21 15:42:03.830814 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:03.830648 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b42dfae-38bf-4d2f-a6f5-229d887c6b4a" containerName="console" Apr 21 15:42:03.830814 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:03.830659 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b42dfae-38bf-4d2f-a6f5-229d887c6b4a" containerName="console" Apr 21 15:42:03.830814 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:03.830717 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="7b42dfae-38bf-4d2f-a6f5-229d887c6b4a" containerName="console" Apr 21 15:42:03.833829 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:03.833811 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2w58f" Apr 21 15:42:03.837027 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:03.837007 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 15:42:03.837229 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:03.837216 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-q22xw\"" Apr 21 15:42:03.838038 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:03.838019 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 15:42:03.847510 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:03.847455 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2w58f"] Apr 21 15:42:03.917629 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:03.917596 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klqdj\" (UniqueName: \"kubernetes.io/projected/5811eb31-9870-4d37-af9b-8232d64f29a4-kube-api-access-klqdj\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2w58f\" (UID: \"5811eb31-9870-4d37-af9b-8232d64f29a4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2w58f" Apr 21 15:42:03.917805 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:03.917654 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5811eb31-9870-4d37-af9b-8232d64f29a4-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2w58f\" (UID: \"5811eb31-9870-4d37-af9b-8232d64f29a4\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2w58f" Apr 21 15:42:03.917805 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:03.917681 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5811eb31-9870-4d37-af9b-8232d64f29a4-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2w58f\" (UID: \"5811eb31-9870-4d37-af9b-8232d64f29a4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2w58f" Apr 21 15:42:04.018988 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:04.018948 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-klqdj\" (UniqueName: \"kubernetes.io/projected/5811eb31-9870-4d37-af9b-8232d64f29a4-kube-api-access-klqdj\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2w58f\" (UID: \"5811eb31-9870-4d37-af9b-8232d64f29a4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2w58f" Apr 21 15:42:04.019159 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:04.019037 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5811eb31-9870-4d37-af9b-8232d64f29a4-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2w58f\" (UID: \"5811eb31-9870-4d37-af9b-8232d64f29a4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2w58f" Apr 21 15:42:04.019159 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:04.019081 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5811eb31-9870-4d37-af9b-8232d64f29a4-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2w58f\" (UID: \"5811eb31-9870-4d37-af9b-8232d64f29a4\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2w58f" Apr 21 15:42:04.019431 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:04.019411 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5811eb31-9870-4d37-af9b-8232d64f29a4-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2w58f\" (UID: \"5811eb31-9870-4d37-af9b-8232d64f29a4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2w58f" Apr 21 15:42:04.019431 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:04.019421 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5811eb31-9870-4d37-af9b-8232d64f29a4-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2w58f\" (UID: \"5811eb31-9870-4d37-af9b-8232d64f29a4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2w58f" Apr 21 15:42:04.037236 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:04.037209 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-klqdj\" (UniqueName: \"kubernetes.io/projected/5811eb31-9870-4d37-af9b-8232d64f29a4-kube-api-access-klqdj\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2w58f\" (UID: \"5811eb31-9870-4d37-af9b-8232d64f29a4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2w58f" Apr 21 15:42:04.143193 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:04.143106 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2w58f" Apr 21 15:42:04.272415 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:04.272392 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2w58f"] Apr 21 15:42:04.274907 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:42:04.274878 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5811eb31_9870_4d37_af9b_8232d64f29a4.slice/crio-9f56250cab12098e68943091c5e1b90ec5cf2ad0e932842bb39acbfa97a60e20 WatchSource:0}: Error finding container 9f56250cab12098e68943091c5e1b90ec5cf2ad0e932842bb39acbfa97a60e20: Status 404 returned error can't find the container with id 9f56250cab12098e68943091c5e1b90ec5cf2ad0e932842bb39acbfa97a60e20 Apr 21 15:42:04.335512 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:04.335468 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2w58f" event={"ID":"5811eb31-9870-4d37-af9b-8232d64f29a4","Type":"ContainerStarted","Data":"9f56250cab12098e68943091c5e1b90ec5cf2ad0e932842bb39acbfa97a60e20"} Apr 21 15:42:09.352009 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:09.351913 2569 generic.go:358] "Generic (PLEG): container finished" podID="5811eb31-9870-4d37-af9b-8232d64f29a4" containerID="499b55409f190d3b13021becb1c1f4ee8877723f5663c28ef2ca4f34a7ba24e8" exitCode=0 Apr 21 15:42:09.352009 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:09.351972 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2w58f" event={"ID":"5811eb31-9870-4d37-af9b-8232d64f29a4","Type":"ContainerDied","Data":"499b55409f190d3b13021becb1c1f4ee8877723f5663c28ef2ca4f34a7ba24e8"} Apr 21 15:42:12.362132 ip-10-0-128-232 kubenswrapper[2569]: 
I0421 15:42:12.362091 2569 generic.go:358] "Generic (PLEG): container finished" podID="5811eb31-9870-4d37-af9b-8232d64f29a4" containerID="0be0b6b517f0f4333c96ae8403a2e18803ac093bd2e24bc989508f7c0becdb04" exitCode=0 Apr 21 15:42:12.362552 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:12.362172 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2w58f" event={"ID":"5811eb31-9870-4d37-af9b-8232d64f29a4","Type":"ContainerDied","Data":"0be0b6b517f0f4333c96ae8403a2e18803ac093bd2e24bc989508f7c0becdb04"} Apr 21 15:42:19.387591 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:19.387496 2569 generic.go:358] "Generic (PLEG): container finished" podID="5811eb31-9870-4d37-af9b-8232d64f29a4" containerID="251b608a78a50cc79fefdb9fc3fa739aabac4e65a1790b5287904ba7dfb0cc0d" exitCode=0 Apr 21 15:42:19.387591 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:19.387508 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2w58f" event={"ID":"5811eb31-9870-4d37-af9b-8232d64f29a4","Type":"ContainerDied","Data":"251b608a78a50cc79fefdb9fc3fa739aabac4e65a1790b5287904ba7dfb0cc0d"} Apr 21 15:42:20.515748 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:20.515726 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2w58f" Apr 21 15:42:20.659041 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:20.658948 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5811eb31-9870-4d37-af9b-8232d64f29a4-bundle\") pod \"5811eb31-9870-4d37-af9b-8232d64f29a4\" (UID: \"5811eb31-9870-4d37-af9b-8232d64f29a4\") " Apr 21 15:42:20.659041 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:20.658992 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5811eb31-9870-4d37-af9b-8232d64f29a4-util\") pod \"5811eb31-9870-4d37-af9b-8232d64f29a4\" (UID: \"5811eb31-9870-4d37-af9b-8232d64f29a4\") " Apr 21 15:42:20.659041 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:20.659021 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klqdj\" (UniqueName: \"kubernetes.io/projected/5811eb31-9870-4d37-af9b-8232d64f29a4-kube-api-access-klqdj\") pod \"5811eb31-9870-4d37-af9b-8232d64f29a4\" (UID: \"5811eb31-9870-4d37-af9b-8232d64f29a4\") " Apr 21 15:42:20.659539 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:20.659515 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5811eb31-9870-4d37-af9b-8232d64f29a4-bundle" (OuterVolumeSpecName: "bundle") pod "5811eb31-9870-4d37-af9b-8232d64f29a4" (UID: "5811eb31-9870-4d37-af9b-8232d64f29a4"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:42:20.661222 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:20.661198 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5811eb31-9870-4d37-af9b-8232d64f29a4-kube-api-access-klqdj" (OuterVolumeSpecName: "kube-api-access-klqdj") pod "5811eb31-9870-4d37-af9b-8232d64f29a4" (UID: "5811eb31-9870-4d37-af9b-8232d64f29a4"). InnerVolumeSpecName "kube-api-access-klqdj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:42:20.662846 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:20.662824 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5811eb31-9870-4d37-af9b-8232d64f29a4-util" (OuterVolumeSpecName: "util") pod "5811eb31-9870-4d37-af9b-8232d64f29a4" (UID: "5811eb31-9870-4d37-af9b-8232d64f29a4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:42:20.759817 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:20.759776 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5811eb31-9870-4d37-af9b-8232d64f29a4-bundle\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:42:20.759817 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:20.759810 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5811eb31-9870-4d37-af9b-8232d64f29a4-util\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:42:20.759817 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:20.759820 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-klqdj\" (UniqueName: \"kubernetes.io/projected/5811eb31-9870-4d37-af9b-8232d64f29a4-kube-api-access-klqdj\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:42:21.395443 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:21.395363 2569 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2w58f" event={"ID":"5811eb31-9870-4d37-af9b-8232d64f29a4","Type":"ContainerDied","Data":"9f56250cab12098e68943091c5e1b90ec5cf2ad0e932842bb39acbfa97a60e20"} Apr 21 15:42:21.395443 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:21.395384 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2w58f" Apr 21 15:42:21.395443 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:21.395396 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f56250cab12098e68943091c5e1b90ec5cf2ad0e932842bb39acbfa97a60e20" Apr 21 15:42:26.718546 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:26.718512 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-89v5j"] Apr 21 15:42:26.718907 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:26.718813 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5811eb31-9870-4d37-af9b-8232d64f29a4" containerName="pull" Apr 21 15:42:26.718907 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:26.718825 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5811eb31-9870-4d37-af9b-8232d64f29a4" containerName="pull" Apr 21 15:42:26.718907 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:26.718840 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5811eb31-9870-4d37-af9b-8232d64f29a4" containerName="util" Apr 21 15:42:26.718907 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:26.718846 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5811eb31-9870-4d37-af9b-8232d64f29a4" containerName="util" Apr 21 15:42:26.718907 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:26.718854 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="5811eb31-9870-4d37-af9b-8232d64f29a4" containerName="extract" Apr 21 15:42:26.718907 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:26.718860 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5811eb31-9870-4d37-af9b-8232d64f29a4" containerName="extract" Apr 21 15:42:26.718907 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:26.718908 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="5811eb31-9870-4d37-af9b-8232d64f29a4" containerName="extract" Apr 21 15:42:26.725401 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:26.725384 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-89v5j" Apr 21 15:42:26.729856 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:26.729832 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 21 15:42:26.730136 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:26.730121 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-48j5l\"" Apr 21 15:42:26.730710 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:26.730687 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 21 15:42:26.730793 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:26.730782 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 21 15:42:26.759247 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:26.759219 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-89v5j"] Apr 21 15:42:26.813268 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:26.813153 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/secret/66707ee1-6550-4bdd-bd28-cd20b0be97e6-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-89v5j\" (UID: \"66707ee1-6550-4bdd-bd28-cd20b0be97e6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-89v5j" Apr 21 15:42:26.813664 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:26.813613 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt5bv\" (UniqueName: \"kubernetes.io/projected/66707ee1-6550-4bdd-bd28-cd20b0be97e6-kube-api-access-gt5bv\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-89v5j\" (UID: \"66707ee1-6550-4bdd-bd28-cd20b0be97e6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-89v5j" Apr 21 15:42:26.914615 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:26.914553 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gt5bv\" (UniqueName: \"kubernetes.io/projected/66707ee1-6550-4bdd-bd28-cd20b0be97e6-kube-api-access-gt5bv\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-89v5j\" (UID: \"66707ee1-6550-4bdd-bd28-cd20b0be97e6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-89v5j" Apr 21 15:42:26.914808 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:26.914657 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/66707ee1-6550-4bdd-bd28-cd20b0be97e6-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-89v5j\" (UID: \"66707ee1-6550-4bdd-bd28-cd20b0be97e6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-89v5j" Apr 21 15:42:26.917046 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:26.917015 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/66707ee1-6550-4bdd-bd28-cd20b0be97e6-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-89v5j\" (UID: 
\"66707ee1-6550-4bdd-bd28-cd20b0be97e6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-89v5j" Apr 21 15:42:26.928280 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:26.928256 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt5bv\" (UniqueName: \"kubernetes.io/projected/66707ee1-6550-4bdd-bd28-cd20b0be97e6-kube-api-access-gt5bv\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-89v5j\" (UID: \"66707ee1-6550-4bdd-bd28-cd20b0be97e6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-89v5j" Apr 21 15:42:27.035634 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:27.035540 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-89v5j" Apr 21 15:42:27.171118 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:27.171092 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-89v5j"] Apr 21 15:42:27.173282 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:42:27.173253 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66707ee1_6550_4bdd_bd28_cd20b0be97e6.slice/crio-ec56c11b95e934549a2417fa40f5b66a6a2729f794f6a657a14ee908ef4ca4dc WatchSource:0}: Error finding container ec56c11b95e934549a2417fa40f5b66a6a2729f794f6a657a14ee908ef4ca4dc: Status 404 returned error can't find the container with id ec56c11b95e934549a2417fa40f5b66a6a2729f794f6a657a14ee908ef4ca4dc Apr 21 15:42:27.414676 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:27.414640 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-89v5j" event={"ID":"66707ee1-6550-4bdd-bd28-cd20b0be97e6","Type":"ContainerStarted","Data":"ec56c11b95e934549a2417fa40f5b66a6a2729f794f6a657a14ee908ef4ca4dc"} Apr 21 15:42:31.428910 ip-10-0-128-232 kubenswrapper[2569]: 
I0421 15:42:31.428872 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-89v5j" event={"ID":"66707ee1-6550-4bdd-bd28-cd20b0be97e6","Type":"ContainerStarted","Data":"d5c89a0e4cb626c2fdf1b0e6a4cf2c4c6b5c859c7f77791a8ea6a146503fdf77"} Apr 21 15:42:31.429337 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:31.429001 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-89v5j" Apr 21 15:42:31.453223 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:31.453154 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-89v5j" podStartSLOduration=1.480264208 podStartE2EDuration="5.453132177s" podCreationTimestamp="2026-04-21 15:42:26 +0000 UTC" firstStartedPulling="2026-04-21 15:42:27.175022101 +0000 UTC m=+422.747055022" lastFinishedPulling="2026-04-21 15:42:31.147890057 +0000 UTC m=+426.719922991" observedRunningTime="2026-04-21 15:42:31.452036509 +0000 UTC m=+427.024069456" watchObservedRunningTime="2026-04-21 15:42:31.453132177 +0000 UTC m=+427.025165121" Apr 21 15:42:32.411980 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:32.411945 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-bb9c4"] Apr 21 15:42:32.415318 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:32.415291 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-bb9c4" Apr 21 15:42:32.417888 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:32.417865 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 21 15:42:32.418014 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:32.417871 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-vrwtg\"" Apr 21 15:42:32.418111 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:32.418096 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 21 15:42:32.426790 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:32.426767 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-bb9c4"] Apr 21 15:42:32.561758 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:32.561715 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxzdh\" (UniqueName: \"kubernetes.io/projected/420c3fcb-5581-42ed-850b-f620ef9aec4c-kube-api-access-dxzdh\") pod \"keda-admission-cf49989db-bb9c4\" (UID: \"420c3fcb-5581-42ed-850b-f620ef9aec4c\") " pod="openshift-keda/keda-admission-cf49989db-bb9c4" Apr 21 15:42:32.562599 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:32.562318 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/420c3fcb-5581-42ed-850b-f620ef9aec4c-certificates\") pod \"keda-admission-cf49989db-bb9c4\" (UID: \"420c3fcb-5581-42ed-850b-f620ef9aec4c\") " pod="openshift-keda/keda-admission-cf49989db-bb9c4" Apr 21 15:42:32.664029 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:32.663994 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/420c3fcb-5581-42ed-850b-f620ef9aec4c-certificates\") pod \"keda-admission-cf49989db-bb9c4\" (UID: \"420c3fcb-5581-42ed-850b-f620ef9aec4c\") " pod="openshift-keda/keda-admission-cf49989db-bb9c4" Apr 21 15:42:32.664217 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:32.664089 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dxzdh\" (UniqueName: \"kubernetes.io/projected/420c3fcb-5581-42ed-850b-f620ef9aec4c-kube-api-access-dxzdh\") pod \"keda-admission-cf49989db-bb9c4\" (UID: \"420c3fcb-5581-42ed-850b-f620ef9aec4c\") " pod="openshift-keda/keda-admission-cf49989db-bb9c4" Apr 21 15:42:32.664217 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:42:32.664143 2569 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 21 15:42:32.664217 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:42:32.664168 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-bb9c4: secret "keda-admission-webhooks-certs" not found Apr 21 15:42:32.664353 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:42:32.664228 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/420c3fcb-5581-42ed-850b-f620ef9aec4c-certificates podName:420c3fcb-5581-42ed-850b-f620ef9aec4c nodeName:}" failed. No retries permitted until 2026-04-21 15:42:33.164209092 +0000 UTC m=+428.736242017 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/420c3fcb-5581-42ed-850b-f620ef9aec4c-certificates") pod "keda-admission-cf49989db-bb9c4" (UID: "420c3fcb-5581-42ed-850b-f620ef9aec4c") : secret "keda-admission-webhooks-certs" not found Apr 21 15:42:32.678789 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:32.678753 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxzdh\" (UniqueName: \"kubernetes.io/projected/420c3fcb-5581-42ed-850b-f620ef9aec4c-kube-api-access-dxzdh\") pod \"keda-admission-cf49989db-bb9c4\" (UID: \"420c3fcb-5581-42ed-850b-f620ef9aec4c\") " pod="openshift-keda/keda-admission-cf49989db-bb9c4" Apr 21 15:42:33.168203 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:33.168173 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/420c3fcb-5581-42ed-850b-f620ef9aec4c-certificates\") pod \"keda-admission-cf49989db-bb9c4\" (UID: \"420c3fcb-5581-42ed-850b-f620ef9aec4c\") " pod="openshift-keda/keda-admission-cf49989db-bb9c4" Apr 21 15:42:33.170565 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:33.170542 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/420c3fcb-5581-42ed-850b-f620ef9aec4c-certificates\") pod \"keda-admission-cf49989db-bb9c4\" (UID: \"420c3fcb-5581-42ed-850b-f620ef9aec4c\") " pod="openshift-keda/keda-admission-cf49989db-bb9c4" Apr 21 15:42:33.325760 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:33.325729 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-bb9c4" Apr 21 15:42:33.451515 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:33.451427 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-bb9c4"] Apr 21 15:42:33.454851 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:42:33.454812 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod420c3fcb_5581_42ed_850b_f620ef9aec4c.slice/crio-c2f790ea903fcb43ed99831a063eb23a3d3cf1fd73f74ab28846d477e7b22293 WatchSource:0}: Error finding container c2f790ea903fcb43ed99831a063eb23a3d3cf1fd73f74ab28846d477e7b22293: Status 404 returned error can't find the container with id c2f790ea903fcb43ed99831a063eb23a3d3cf1fd73f74ab28846d477e7b22293 Apr 21 15:42:34.440839 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:34.440794 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-bb9c4" event={"ID":"420c3fcb-5581-42ed-850b-f620ef9aec4c","Type":"ContainerStarted","Data":"c2f790ea903fcb43ed99831a063eb23a3d3cf1fd73f74ab28846d477e7b22293"} Apr 21 15:42:35.445519 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:35.445470 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-bb9c4" event={"ID":"420c3fcb-5581-42ed-850b-f620ef9aec4c","Type":"ContainerStarted","Data":"dedbe15af8f0fc0402896b863c63832628d23a26e957c6cc14c204d7cae733d3"} Apr 21 15:42:35.445889 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:35.445761 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-bb9c4" Apr 21 15:42:35.468185 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:35.468135 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-bb9c4" podStartSLOduration=2.014246396 podStartE2EDuration="3.468122727s" 
podCreationTimestamp="2026-04-21 15:42:32 +0000 UTC" firstStartedPulling="2026-04-21 15:42:33.456328326 +0000 UTC m=+429.028361262" lastFinishedPulling="2026-04-21 15:42:34.910204668 +0000 UTC m=+430.482237593" observedRunningTime="2026-04-21 15:42:35.464746225 +0000 UTC m=+431.036779169" watchObservedRunningTime="2026-04-21 15:42:35.468122727 +0000 UTC m=+431.040155670" Apr 21 15:42:52.434386 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:52.434306 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-89v5j" Apr 21 15:42:56.451136 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:42:56.451107 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-bb9c4" Apr 21 15:43:23.522986 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:23.522954 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dnq2nj"] Apr 21 15:43:23.526371 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:23.526353 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dnq2nj" Apr 21 15:43:23.529501 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:23.529453 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-q22xw\"" Apr 21 15:43:23.529634 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:23.529595 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 15:43:23.530590 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:23.530570 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 15:43:23.535443 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:23.535421 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dnq2nj"] Apr 21 15:43:23.667110 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:23.667071 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7606e7ba-318b-4a17-b566-258226c6b107-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dnq2nj\" (UID: \"7606e7ba-318b-4a17-b566-258226c6b107\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dnq2nj" Apr 21 15:43:23.667290 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:23.667133 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrdnn\" (UniqueName: \"kubernetes.io/projected/7606e7ba-318b-4a17-b566-258226c6b107-kube-api-access-wrdnn\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dnq2nj\" (UID: \"7606e7ba-318b-4a17-b566-258226c6b107\") " 
pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dnq2nj" Apr 21 15:43:23.667290 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:23.667162 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7606e7ba-318b-4a17-b566-258226c6b107-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dnq2nj\" (UID: \"7606e7ba-318b-4a17-b566-258226c6b107\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dnq2nj" Apr 21 15:43:23.767855 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:23.767820 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7606e7ba-318b-4a17-b566-258226c6b107-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dnq2nj\" (UID: \"7606e7ba-318b-4a17-b566-258226c6b107\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dnq2nj" Apr 21 15:43:23.768015 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:23.767882 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wrdnn\" (UniqueName: \"kubernetes.io/projected/7606e7ba-318b-4a17-b566-258226c6b107-kube-api-access-wrdnn\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dnq2nj\" (UID: \"7606e7ba-318b-4a17-b566-258226c6b107\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dnq2nj" Apr 21 15:43:23.768015 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:23.767916 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7606e7ba-318b-4a17-b566-258226c6b107-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dnq2nj\" (UID: \"7606e7ba-318b-4a17-b566-258226c6b107\") " 
pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dnq2nj" Apr 21 15:43:23.768232 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:23.768215 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7606e7ba-318b-4a17-b566-258226c6b107-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dnq2nj\" (UID: \"7606e7ba-318b-4a17-b566-258226c6b107\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dnq2nj" Apr 21 15:43:23.768304 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:23.768286 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7606e7ba-318b-4a17-b566-258226c6b107-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dnq2nj\" (UID: \"7606e7ba-318b-4a17-b566-258226c6b107\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dnq2nj" Apr 21 15:43:23.776859 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:23.776791 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrdnn\" (UniqueName: \"kubernetes.io/projected/7606e7ba-318b-4a17-b566-258226c6b107-kube-api-access-wrdnn\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dnq2nj\" (UID: \"7606e7ba-318b-4a17-b566-258226c6b107\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dnq2nj" Apr 21 15:43:23.836259 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:23.836222 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dnq2nj" Apr 21 15:43:23.957093 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:23.957069 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dnq2nj"] Apr 21 15:43:23.960079 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:43:23.960052 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7606e7ba_318b_4a17_b566_258226c6b107.slice/crio-cdc3c790c89666127824998fc10937f8abc905158aa1149eec4530a3112a8e56 WatchSource:0}: Error finding container cdc3c790c89666127824998fc10937f8abc905158aa1149eec4530a3112a8e56: Status 404 returned error can't find the container with id cdc3c790c89666127824998fc10937f8abc905158aa1149eec4530a3112a8e56 Apr 21 15:43:24.612190 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:24.612156 2569 generic.go:358] "Generic (PLEG): container finished" podID="7606e7ba-318b-4a17-b566-258226c6b107" containerID="6022b1e2754274d2aae4902b7d768f3df39b996e54a079dd50c6b921136e99ce" exitCode=0 Apr 21 15:43:24.612604 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:24.612235 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dnq2nj" event={"ID":"7606e7ba-318b-4a17-b566-258226c6b107","Type":"ContainerDied","Data":"6022b1e2754274d2aae4902b7d768f3df39b996e54a079dd50c6b921136e99ce"} Apr 21 15:43:24.612604 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:24.612271 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dnq2nj" event={"ID":"7606e7ba-318b-4a17-b566-258226c6b107","Type":"ContainerStarted","Data":"cdc3c790c89666127824998fc10937f8abc905158aa1149eec4530a3112a8e56"} Apr 21 15:43:36.651644 ip-10-0-128-232 kubenswrapper[2569]: 
I0421 15:43:36.651610 2569 generic.go:358] "Generic (PLEG): container finished" podID="7606e7ba-318b-4a17-b566-258226c6b107" containerID="0ac02b18e657732ab355786db10811619ce49e412b2ecee45b3a8377c17911bb" exitCode=0
Apr 21 15:43:36.652114 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:36.651698 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dnq2nj" event={"ID":"7606e7ba-318b-4a17-b566-258226c6b107","Type":"ContainerDied","Data":"0ac02b18e657732ab355786db10811619ce49e412b2ecee45b3a8377c17911bb"}
Apr 21 15:43:37.656976 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:37.656942 2569 generic.go:358] "Generic (PLEG): container finished" podID="7606e7ba-318b-4a17-b566-258226c6b107" containerID="cd395a517512a5b09c172316df42f4ded540af3fc5ec9a6a8302198a4c8dbb3d" exitCode=0
Apr 21 15:43:37.657410 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:37.657021 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dnq2nj" event={"ID":"7606e7ba-318b-4a17-b566-258226c6b107","Type":"ContainerDied","Data":"cd395a517512a5b09c172316df42f4ded540af3fc5ec9a6a8302198a4c8dbb3d"}
Apr 21 15:43:38.782145 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:38.782122 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dnq2nj"
Apr 21 15:43:38.888222 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:38.888184 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrdnn\" (UniqueName: \"kubernetes.io/projected/7606e7ba-318b-4a17-b566-258226c6b107-kube-api-access-wrdnn\") pod \"7606e7ba-318b-4a17-b566-258226c6b107\" (UID: \"7606e7ba-318b-4a17-b566-258226c6b107\") "
Apr 21 15:43:38.888407 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:38.888257 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7606e7ba-318b-4a17-b566-258226c6b107-util\") pod \"7606e7ba-318b-4a17-b566-258226c6b107\" (UID: \"7606e7ba-318b-4a17-b566-258226c6b107\") "
Apr 21 15:43:38.888407 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:38.888291 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7606e7ba-318b-4a17-b566-258226c6b107-bundle\") pod \"7606e7ba-318b-4a17-b566-258226c6b107\" (UID: \"7606e7ba-318b-4a17-b566-258226c6b107\") "
Apr 21 15:43:38.889054 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:38.889024 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7606e7ba-318b-4a17-b566-258226c6b107-bundle" (OuterVolumeSpecName: "bundle") pod "7606e7ba-318b-4a17-b566-258226c6b107" (UID: "7606e7ba-318b-4a17-b566-258226c6b107"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:43:38.890324 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:38.890296 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7606e7ba-318b-4a17-b566-258226c6b107-kube-api-access-wrdnn" (OuterVolumeSpecName: "kube-api-access-wrdnn") pod "7606e7ba-318b-4a17-b566-258226c6b107" (UID: "7606e7ba-318b-4a17-b566-258226c6b107"). InnerVolumeSpecName "kube-api-access-wrdnn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:43:38.894591 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:38.894565 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7606e7ba-318b-4a17-b566-258226c6b107-util" (OuterVolumeSpecName: "util") pod "7606e7ba-318b-4a17-b566-258226c6b107" (UID: "7606e7ba-318b-4a17-b566-258226c6b107"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:43:38.989586 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:38.989546 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7606e7ba-318b-4a17-b566-258226c6b107-util\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\""
Apr 21 15:43:38.989586 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:38.989581 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7606e7ba-318b-4a17-b566-258226c6b107-bundle\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\""
Apr 21 15:43:38.989586 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:38.989590 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wrdnn\" (UniqueName: \"kubernetes.io/projected/7606e7ba-318b-4a17-b566-258226c6b107-kube-api-access-wrdnn\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\""
Apr 21 15:43:39.665170 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:39.665137 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dnq2nj" event={"ID":"7606e7ba-318b-4a17-b566-258226c6b107","Type":"ContainerDied","Data":"cdc3c790c89666127824998fc10937f8abc905158aa1149eec4530a3112a8e56"}
Apr 21 15:43:39.665170 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:39.665172 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdc3c790c89666127824998fc10937f8abc905158aa1149eec4530a3112a8e56"
Apr 21 15:43:39.665370 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:39.665236 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dnq2nj"
Apr 21 15:43:55.325324 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:55.325286 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f699r2"]
Apr 21 15:43:55.325817 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:55.325620 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7606e7ba-318b-4a17-b566-258226c6b107" containerName="pull"
Apr 21 15:43:55.325817 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:55.325632 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7606e7ba-318b-4a17-b566-258226c6b107" containerName="pull"
Apr 21 15:43:55.325817 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:55.325645 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7606e7ba-318b-4a17-b566-258226c6b107" containerName="util"
Apr 21 15:43:55.325817 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:55.325651 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7606e7ba-318b-4a17-b566-258226c6b107" containerName="util"
Apr 21 15:43:55.325817 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:55.325660 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7606e7ba-318b-4a17-b566-258226c6b107" containerName="extract"
Apr 21 15:43:55.325817 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:55.325666 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7606e7ba-318b-4a17-b566-258226c6b107" containerName="extract"
Apr 21 15:43:55.325817 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:55.325735 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="7606e7ba-318b-4a17-b566-258226c6b107" containerName="extract"
Apr 21 15:43:55.328358 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:55.328340 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f699r2"
Apr 21 15:43:55.331062 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:55.331043 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-q22xw\""
Apr 21 15:43:55.331145 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:55.331048 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 21 15:43:55.332068 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:55.332053 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 21 15:43:55.338497 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:55.338461 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f699r2"]
Apr 21 15:43:55.429019 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:55.428981 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th2wx\" (UniqueName: \"kubernetes.io/projected/9d88c676-d16e-4121-91a9-98da3d69018b-kube-api-access-th2wx\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f699r2\" (UID: \"9d88c676-d16e-4121-91a9-98da3d69018b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f699r2"
Apr 21 15:43:55.429202 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:55.429037 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d88c676-d16e-4121-91a9-98da3d69018b-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f699r2\" (UID: \"9d88c676-d16e-4121-91a9-98da3d69018b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f699r2"
Apr 21 15:43:55.429202 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:55.429055 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d88c676-d16e-4121-91a9-98da3d69018b-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f699r2\" (UID: \"9d88c676-d16e-4121-91a9-98da3d69018b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f699r2"
Apr 21 15:43:55.530237 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:55.530201 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-th2wx\" (UniqueName: \"kubernetes.io/projected/9d88c676-d16e-4121-91a9-98da3d69018b-kube-api-access-th2wx\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f699r2\" (UID: \"9d88c676-d16e-4121-91a9-98da3d69018b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f699r2"
Apr 21 15:43:55.530445 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:55.530265 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d88c676-d16e-4121-91a9-98da3d69018b-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f699r2\" (UID: \"9d88c676-d16e-4121-91a9-98da3d69018b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f699r2"
Apr 21 15:43:55.530445 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:55.530290 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d88c676-d16e-4121-91a9-98da3d69018b-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f699r2\" (UID: \"9d88c676-d16e-4121-91a9-98da3d69018b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f699r2"
Apr 21 15:43:55.530672 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:55.530650 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d88c676-d16e-4121-91a9-98da3d69018b-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f699r2\" (UID: \"9d88c676-d16e-4121-91a9-98da3d69018b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f699r2"
Apr 21 15:43:55.530743 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:55.530717 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d88c676-d16e-4121-91a9-98da3d69018b-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f699r2\" (UID: \"9d88c676-d16e-4121-91a9-98da3d69018b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f699r2"
Apr 21 15:43:55.538816 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:55.538793 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-th2wx\" (UniqueName: \"kubernetes.io/projected/9d88c676-d16e-4121-91a9-98da3d69018b-kube-api-access-th2wx\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f699r2\" (UID: \"9d88c676-d16e-4121-91a9-98da3d69018b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f699r2"
Apr 21 15:43:55.637653 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:55.637571 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f699r2"
Apr 21 15:43:55.778123 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:55.778097 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f699r2"]
Apr 21 15:43:55.780310 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:43:55.780273 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d88c676_d16e_4121_91a9_98da3d69018b.slice/crio-476abf29061fa0d8b1d4a65f2adb0a1ee779ff537bd1097a5a49a2cb51dc7d9c WatchSource:0}: Error finding container 476abf29061fa0d8b1d4a65f2adb0a1ee779ff537bd1097a5a49a2cb51dc7d9c: Status 404 returned error can't find the container with id 476abf29061fa0d8b1d4a65f2adb0a1ee779ff537bd1097a5a49a2cb51dc7d9c
Apr 21 15:43:56.723896 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:56.723861 2569 generic.go:358] "Generic (PLEG): container finished" podID="9d88c676-d16e-4121-91a9-98da3d69018b" containerID="9992db52cb29b47b2dc165e33682a86ba3d514d3f127b315123975afa1e6df9f" exitCode=0
Apr 21 15:43:56.724268 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:56.723924 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f699r2" event={"ID":"9d88c676-d16e-4121-91a9-98da3d69018b","Type":"ContainerDied","Data":"9992db52cb29b47b2dc165e33682a86ba3d514d3f127b315123975afa1e6df9f"}
Apr 21 15:43:56.724268 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:56.723944 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f699r2" event={"ID":"9d88c676-d16e-4121-91a9-98da3d69018b","Type":"ContainerStarted","Data":"476abf29061fa0d8b1d4a65f2adb0a1ee779ff537bd1097a5a49a2cb51dc7d9c"}
Apr 21 15:43:59.736244 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:59.736207 2569 generic.go:358] "Generic (PLEG): container finished" podID="9d88c676-d16e-4121-91a9-98da3d69018b" containerID="28680ca7f9c5f0ea0eacaa00be3b8ccd73fb8c690b345779db0413aba37e6636" exitCode=0
Apr 21 15:43:59.736709 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:43:59.736298 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f699r2" event={"ID":"9d88c676-d16e-4121-91a9-98da3d69018b","Type":"ContainerDied","Data":"28680ca7f9c5f0ea0eacaa00be3b8ccd73fb8c690b345779db0413aba37e6636"}
Apr 21 15:44:00.741589 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:00.741552 2569 generic.go:358] "Generic (PLEG): container finished" podID="9d88c676-d16e-4121-91a9-98da3d69018b" containerID="c509d8fd76bc3e635c3bcd86d425ea2ef6ead78fe249c6c50a98946cbcfccedb" exitCode=0
Apr 21 15:44:00.742027 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:00.741625 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f699r2" event={"ID":"9d88c676-d16e-4121-91a9-98da3d69018b","Type":"ContainerDied","Data":"c509d8fd76bc3e635c3bcd86d425ea2ef6ead78fe249c6c50a98946cbcfccedb"}
Apr 21 15:44:01.876336 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:01.876311 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f699r2"
Apr 21 15:44:01.973153 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:01.973121 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d88c676-d16e-4121-91a9-98da3d69018b-util\") pod \"9d88c676-d16e-4121-91a9-98da3d69018b\" (UID: \"9d88c676-d16e-4121-91a9-98da3d69018b\") "
Apr 21 15:44:01.973347 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:01.973183 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-th2wx\" (UniqueName: \"kubernetes.io/projected/9d88c676-d16e-4121-91a9-98da3d69018b-kube-api-access-th2wx\") pod \"9d88c676-d16e-4121-91a9-98da3d69018b\" (UID: \"9d88c676-d16e-4121-91a9-98da3d69018b\") "
Apr 21 15:44:01.973347 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:01.973207 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d88c676-d16e-4121-91a9-98da3d69018b-bundle\") pod \"9d88c676-d16e-4121-91a9-98da3d69018b\" (UID: \"9d88c676-d16e-4121-91a9-98da3d69018b\") "
Apr 21 15:44:01.973753 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:01.973703 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d88c676-d16e-4121-91a9-98da3d69018b-bundle" (OuterVolumeSpecName: "bundle") pod "9d88c676-d16e-4121-91a9-98da3d69018b" (UID: "9d88c676-d16e-4121-91a9-98da3d69018b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:44:01.975339 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:01.975318 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d88c676-d16e-4121-91a9-98da3d69018b-kube-api-access-th2wx" (OuterVolumeSpecName: "kube-api-access-th2wx") pod "9d88c676-d16e-4121-91a9-98da3d69018b" (UID: "9d88c676-d16e-4121-91a9-98da3d69018b"). InnerVolumeSpecName "kube-api-access-th2wx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:44:01.977718 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:01.977680 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d88c676-d16e-4121-91a9-98da3d69018b-util" (OuterVolumeSpecName: "util") pod "9d88c676-d16e-4121-91a9-98da3d69018b" (UID: "9d88c676-d16e-4121-91a9-98da3d69018b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:44:02.074183 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:02.074092 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-th2wx\" (UniqueName: \"kubernetes.io/projected/9d88c676-d16e-4121-91a9-98da3d69018b-kube-api-access-th2wx\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\""
Apr 21 15:44:02.074183 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:02.074127 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d88c676-d16e-4121-91a9-98da3d69018b-bundle\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\""
Apr 21 15:44:02.074183 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:02.074137 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d88c676-d16e-4121-91a9-98da3d69018b-util\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\""
Apr 21 15:44:02.749954 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:02.749928 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f699r2"
Apr 21 15:44:02.750118 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:02.749925 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f699r2" event={"ID":"9d88c676-d16e-4121-91a9-98da3d69018b","Type":"ContainerDied","Data":"476abf29061fa0d8b1d4a65f2adb0a1ee779ff537bd1097a5a49a2cb51dc7d9c"}
Apr 21 15:44:02.750118 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:02.750031 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="476abf29061fa0d8b1d4a65f2adb0a1ee779ff537bd1097a5a49a2cb51dc7d9c"
Apr 21 15:44:03.102111 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:03.102034 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-t2k7z"]
Apr 21 15:44:03.102451 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:03.102318 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d88c676-d16e-4121-91a9-98da3d69018b" containerName="extract"
Apr 21 15:44:03.102451 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:03.102331 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d88c676-d16e-4121-91a9-98da3d69018b" containerName="extract"
Apr 21 15:44:03.102451 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:03.102352 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d88c676-d16e-4121-91a9-98da3d69018b" containerName="util"
Apr 21 15:44:03.102451 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:03.102358 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d88c676-d16e-4121-91a9-98da3d69018b" containerName="util"
Apr 21 15:44:03.102451 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:03.102367 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d88c676-d16e-4121-91a9-98da3d69018b" containerName="pull"
Apr 21 15:44:03.102451 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:03.102372 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d88c676-d16e-4121-91a9-98da3d69018b" containerName="pull"
Apr 21 15:44:03.102451 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:03.102426 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d88c676-d16e-4121-91a9-98da3d69018b" containerName="extract"
Apr 21 15:44:03.105915 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:03.105899 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-t2k7z"
Apr 21 15:44:03.108840 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:03.108820 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 21 15:44:03.109101 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:03.109087 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 21 15:44:03.109176 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:03.109113 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-znfgn\""
Apr 21 15:44:03.117682 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:03.117658 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-t2k7z"]
Apr 21 15:44:03.183917 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:03.183881 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c46877ea-fe42-40dc-b9bf-954d233f3f57-bound-sa-token\") pod \"cert-manager-79c8d999ff-t2k7z\" (UID: \"c46877ea-fe42-40dc-b9bf-954d233f3f57\") " pod="cert-manager/cert-manager-79c8d999ff-t2k7z"
Apr 21 15:44:03.184098 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:03.183924 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wndbz\" (UniqueName: \"kubernetes.io/projected/c46877ea-fe42-40dc-b9bf-954d233f3f57-kube-api-access-wndbz\") pod \"cert-manager-79c8d999ff-t2k7z\" (UID: \"c46877ea-fe42-40dc-b9bf-954d233f3f57\") " pod="cert-manager/cert-manager-79c8d999ff-t2k7z"
Apr 21 15:44:03.284542 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:03.284505 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c46877ea-fe42-40dc-b9bf-954d233f3f57-bound-sa-token\") pod \"cert-manager-79c8d999ff-t2k7z\" (UID: \"c46877ea-fe42-40dc-b9bf-954d233f3f57\") " pod="cert-manager/cert-manager-79c8d999ff-t2k7z"
Apr 21 15:44:03.284677 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:03.284554 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wndbz\" (UniqueName: \"kubernetes.io/projected/c46877ea-fe42-40dc-b9bf-954d233f3f57-kube-api-access-wndbz\") pod \"cert-manager-79c8d999ff-t2k7z\" (UID: \"c46877ea-fe42-40dc-b9bf-954d233f3f57\") " pod="cert-manager/cert-manager-79c8d999ff-t2k7z"
Apr 21 15:44:03.294114 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:03.294077 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c46877ea-fe42-40dc-b9bf-954d233f3f57-bound-sa-token\") pod \"cert-manager-79c8d999ff-t2k7z\" (UID: \"c46877ea-fe42-40dc-b9bf-954d233f3f57\") " pod="cert-manager/cert-manager-79c8d999ff-t2k7z"
Apr 21 15:44:03.294224 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:03.294090 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wndbz\" (UniqueName: \"kubernetes.io/projected/c46877ea-fe42-40dc-b9bf-954d233f3f57-kube-api-access-wndbz\") pod \"cert-manager-79c8d999ff-t2k7z\" (UID: \"c46877ea-fe42-40dc-b9bf-954d233f3f57\") " pod="cert-manager/cert-manager-79c8d999ff-t2k7z"
Apr 21 15:44:03.424618 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:03.424584 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-t2k7z"
Apr 21 15:44:03.548643 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:03.548548 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-t2k7z"]
Apr 21 15:44:03.550591 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:44:03.550562 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc46877ea_fe42_40dc_b9bf_954d233f3f57.slice/crio-a3d9f7d43fa1533fe2be3221a5bb36c1717eb54364b077f6250c1402b92a5905 WatchSource:0}: Error finding container a3d9f7d43fa1533fe2be3221a5bb36c1717eb54364b077f6250c1402b92a5905: Status 404 returned error can't find the container with id a3d9f7d43fa1533fe2be3221a5bb36c1717eb54364b077f6250c1402b92a5905
Apr 21 15:44:03.754789 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:03.754692 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-t2k7z" event={"ID":"c46877ea-fe42-40dc-b9bf-954d233f3f57","Type":"ContainerStarted","Data":"a3d9f7d43fa1533fe2be3221a5bb36c1717eb54364b077f6250c1402b92a5905"}
Apr 21 15:44:06.766690 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:06.766653 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-t2k7z" event={"ID":"c46877ea-fe42-40dc-b9bf-954d233f3f57","Type":"ContainerStarted","Data":"bd5b7704d51a2e78627ca25223291aa324ce79e5abe52032f0b24b7c1c9ed05e"}
Apr 21 15:44:18.814273 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:18.814220 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-t2k7z" podStartSLOduration=13.339279306 podStartE2EDuration="15.814204957s" podCreationTimestamp="2026-04-21 15:44:03 +0000 UTC" firstStartedPulling="2026-04-21 15:44:03.552347908 +0000 UTC m=+519.124380829" lastFinishedPulling="2026-04-21 15:44:06.027273543 +0000 UTC m=+521.599306480" observedRunningTime="2026-04-21 15:44:06.815183997 +0000 UTC m=+522.387216941" watchObservedRunningTime="2026-04-21 15:44:18.814204957 +0000 UTC m=+534.386237900"
Apr 21 15:44:18.814687 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:18.814622 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jggxm"]
Apr 21 15:44:18.817798 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:18.817782 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jggxm"
Apr 21 15:44:18.820659 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:18.820634 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 21 15:44:18.821732 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:18.821715 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-q22xw\""
Apr 21 15:44:18.821845 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:18.821749 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 21 15:44:18.824769 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:18.824745 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jggxm"]
Apr 21 15:44:18.911186 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:18.911140 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnqlt\" (UniqueName: \"kubernetes.io/projected/4745f433-9d49-49c3-94a8-7dba2fb4e1bf-kube-api-access-gnqlt\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jggxm\" (UID: \"4745f433-9d49-49c3-94a8-7dba2fb4e1bf\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jggxm"
Apr 21 15:44:18.911391 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:18.911197 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4745f433-9d49-49c3-94a8-7dba2fb4e1bf-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jggxm\" (UID: \"4745f433-9d49-49c3-94a8-7dba2fb4e1bf\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jggxm"
Apr 21 15:44:18.911391 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:18.911331 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4745f433-9d49-49c3-94a8-7dba2fb4e1bf-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jggxm\" (UID: \"4745f433-9d49-49c3-94a8-7dba2fb4e1bf\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jggxm"
Apr 21 15:44:19.012402 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:19.012360 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gnqlt\" (UniqueName: \"kubernetes.io/projected/4745f433-9d49-49c3-94a8-7dba2fb4e1bf-kube-api-access-gnqlt\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jggxm\" (UID: \"4745f433-9d49-49c3-94a8-7dba2fb4e1bf\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jggxm"
Apr 21 15:44:19.012619 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:19.012415 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4745f433-9d49-49c3-94a8-7dba2fb4e1bf-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jggxm\" (UID: \"4745f433-9d49-49c3-94a8-7dba2fb4e1bf\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jggxm"
Apr 21 15:44:19.012619 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:19.012604 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4745f433-9d49-49c3-94a8-7dba2fb4e1bf-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jggxm\" (UID: \"4745f433-9d49-49c3-94a8-7dba2fb4e1bf\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jggxm"
Apr 21 15:44:19.012798 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:19.012765 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4745f433-9d49-49c3-94a8-7dba2fb4e1bf-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jggxm\" (UID: \"4745f433-9d49-49c3-94a8-7dba2fb4e1bf\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jggxm"
Apr 21 15:44:19.012927 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:19.012910 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4745f433-9d49-49c3-94a8-7dba2fb4e1bf-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jggxm\" (UID: \"4745f433-9d49-49c3-94a8-7dba2fb4e1bf\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jggxm"
Apr 21 15:44:19.023192 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:19.023168 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnqlt\" (UniqueName: \"kubernetes.io/projected/4745f433-9d49-49c3-94a8-7dba2fb4e1bf-kube-api-access-gnqlt\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jggxm\" (UID: \"4745f433-9d49-49c3-94a8-7dba2fb4e1bf\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jggxm"
Apr 21 15:44:19.127709 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:19.127626 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jggxm"
Apr 21 15:44:19.254510 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:19.254387 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jggxm"]
Apr 21 15:44:19.257253 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:44:19.257224 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4745f433_9d49_49c3_94a8_7dba2fb4e1bf.slice/crio-b0b3db97f6a9ddf3418e219fba3f90b925f297737e7ac12b3e5ab29109b342ae WatchSource:0}: Error finding container b0b3db97f6a9ddf3418e219fba3f90b925f297737e7ac12b3e5ab29109b342ae: Status 404 returned error can't find the container with id b0b3db97f6a9ddf3418e219fba3f90b925f297737e7ac12b3e5ab29109b342ae
Apr 21 15:44:19.818337 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:19.818289 2569 generic.go:358] "Generic (PLEG): container finished" podID="4745f433-9d49-49c3-94a8-7dba2fb4e1bf" containerID="748c1250c7075a3ed404723168f15be6e302d47f73972589db241ab312059171" exitCode=0
Apr 21 15:44:19.818712 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:19.818368 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jggxm" event={"ID":"4745f433-9d49-49c3-94a8-7dba2fb4e1bf","Type":"ContainerDied","Data":"748c1250c7075a3ed404723168f15be6e302d47f73972589db241ab312059171"}
Apr 21 15:44:19.818712 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:19.818404 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jggxm" event={"ID":"4745f433-9d49-49c3-94a8-7dba2fb4e1bf","Type":"ContainerStarted","Data":"b0b3db97f6a9ddf3418e219fba3f90b925f297737e7ac12b3e5ab29109b342ae"}
Apr 21 15:44:20.823139 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:20.823108 2569 generic.go:358] "Generic (PLEG): container finished" podID="4745f433-9d49-49c3-94a8-7dba2fb4e1bf" containerID="c8126a25e65e706e002c42a70c6f5d776277b696d95e5614d967bdeffe671eae" exitCode=0
Apr 21 15:44:20.823541 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:20.823156 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jggxm" event={"ID":"4745f433-9d49-49c3-94a8-7dba2fb4e1bf","Type":"ContainerDied","Data":"c8126a25e65e706e002c42a70c6f5d776277b696d95e5614d967bdeffe671eae"}
Apr 21 15:44:21.828453 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:21.828405 2569 generic.go:358] "Generic (PLEG): container finished" podID="4745f433-9d49-49c3-94a8-7dba2fb4e1bf" containerID="72b45970f75dd031b5e4b4b540c65ed8383521831641403917274217e12c3cf8" exitCode=0
Apr 21 15:44:21.828847 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:21.828501 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jggxm" event={"ID":"4745f433-9d49-49c3-94a8-7dba2fb4e1bf","Type":"ContainerDied","Data":"72b45970f75dd031b5e4b4b540c65ed8383521831641403917274217e12c3cf8"}
Apr 21 15:44:22.954794 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:22.954770 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jggxm"
Apr 21 15:44:23.047527 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:23.047463 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4745f433-9d49-49c3-94a8-7dba2fb4e1bf-util\") pod \"4745f433-9d49-49c3-94a8-7dba2fb4e1bf\" (UID: \"4745f433-9d49-49c3-94a8-7dba2fb4e1bf\") "
Apr 21 15:44:23.047696 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:23.047592 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnqlt\" (UniqueName: \"kubernetes.io/projected/4745f433-9d49-49c3-94a8-7dba2fb4e1bf-kube-api-access-gnqlt\") pod \"4745f433-9d49-49c3-94a8-7dba2fb4e1bf\" (UID: \"4745f433-9d49-49c3-94a8-7dba2fb4e1bf\") "
Apr 21 15:44:23.047696 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:23.047620 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4745f433-9d49-49c3-94a8-7dba2fb4e1bf-bundle\") pod \"4745f433-9d49-49c3-94a8-7dba2fb4e1bf\" (UID: \"4745f433-9d49-49c3-94a8-7dba2fb4e1bf\") "
Apr 21 15:44:23.048515 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:23.048460 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4745f433-9d49-49c3-94a8-7dba2fb4e1bf-bundle" (OuterVolumeSpecName: "bundle") pod "4745f433-9d49-49c3-94a8-7dba2fb4e1bf" (UID: "4745f433-9d49-49c3-94a8-7dba2fb4e1bf"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:44:23.049745 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:23.049711 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4745f433-9d49-49c3-94a8-7dba2fb4e1bf-kube-api-access-gnqlt" (OuterVolumeSpecName: "kube-api-access-gnqlt") pod "4745f433-9d49-49c3-94a8-7dba2fb4e1bf" (UID: "4745f433-9d49-49c3-94a8-7dba2fb4e1bf"). InnerVolumeSpecName "kube-api-access-gnqlt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:44:23.053378 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:23.053353 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4745f433-9d49-49c3-94a8-7dba2fb4e1bf-util" (OuterVolumeSpecName: "util") pod "4745f433-9d49-49c3-94a8-7dba2fb4e1bf" (UID: "4745f433-9d49-49c3-94a8-7dba2fb4e1bf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:44:23.148344 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:23.148260 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gnqlt\" (UniqueName: \"kubernetes.io/projected/4745f433-9d49-49c3-94a8-7dba2fb4e1bf-kube-api-access-gnqlt\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:44:23.148344 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:23.148291 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4745f433-9d49-49c3-94a8-7dba2fb4e1bf-bundle\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:44:23.148344 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:23.148301 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4745f433-9d49-49c3-94a8-7dba2fb4e1bf-util\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:44:23.837887 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:23.837853 2569 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jggxm" Apr 21 15:44:23.838064 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:23.837849 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jggxm" event={"ID":"4745f433-9d49-49c3-94a8-7dba2fb4e1bf","Type":"ContainerDied","Data":"b0b3db97f6a9ddf3418e219fba3f90b925f297737e7ac12b3e5ab29109b342ae"} Apr 21 15:44:23.838064 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:23.837971 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0b3db97f6a9ddf3418e219fba3f90b925f297737e7ac12b3e5ab29109b342ae" Apr 21 15:44:26.068089 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:26.068056 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-fb974466f-4bczq"] Apr 21 15:44:26.068467 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:26.068355 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4745f433-9d49-49c3-94a8-7dba2fb4e1bf" containerName="util" Apr 21 15:44:26.068467 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:26.068366 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="4745f433-9d49-49c3-94a8-7dba2fb4e1bf" containerName="util" Apr 21 15:44:26.068467 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:26.068379 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4745f433-9d49-49c3-94a8-7dba2fb4e1bf" containerName="extract" Apr 21 15:44:26.068467 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:26.068384 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="4745f433-9d49-49c3-94a8-7dba2fb4e1bf" containerName="extract" Apr 21 15:44:26.068467 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:26.068392 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="4745f433-9d49-49c3-94a8-7dba2fb4e1bf" containerName="pull" Apr 21 15:44:26.068467 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:26.068397 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="4745f433-9d49-49c3-94a8-7dba2fb4e1bf" containerName="pull" Apr 21 15:44:26.068467 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:26.068451 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="4745f433-9d49-49c3-94a8-7dba2fb4e1bf" containerName="extract" Apr 21 15:44:26.072508 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:26.072491 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-fb974466f-4bczq" Apr 21 15:44:26.077663 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:26.077634 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 21 15:44:26.078163 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:26.078136 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 21 15:44:26.078258 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:26.078165 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 21 15:44:26.078842 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:26.078676 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 21 15:44:26.078918 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:26.078838 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-5d72j\"" Apr 21 15:44:26.078918 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:26.078906 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 21 15:44:26.084100 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:26.084077 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-fb974466f-4bczq"] Apr 21 15:44:26.170676 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:26.170630 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/831e2eae-f4ad-4a65-b953-17c2024d94c1-manager-config\") pod \"lws-controller-manager-fb974466f-4bczq\" (UID: \"831e2eae-f4ad-4a65-b953-17c2024d94c1\") " pod="openshift-lws-operator/lws-controller-manager-fb974466f-4bczq" Apr 21 15:44:26.170676 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:26.170679 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/831e2eae-f4ad-4a65-b953-17c2024d94c1-metrics-cert\") pod \"lws-controller-manager-fb974466f-4bczq\" (UID: \"831e2eae-f4ad-4a65-b953-17c2024d94c1\") " pod="openshift-lws-operator/lws-controller-manager-fb974466f-4bczq" Apr 21 15:44:26.170935 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:26.170859 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/831e2eae-f4ad-4a65-b953-17c2024d94c1-cert\") pod \"lws-controller-manager-fb974466f-4bczq\" (UID: \"831e2eae-f4ad-4a65-b953-17c2024d94c1\") " pod="openshift-lws-operator/lws-controller-manager-fb974466f-4bczq" Apr 21 15:44:26.170935 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:26.170925 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvhzv\" (UniqueName: \"kubernetes.io/projected/831e2eae-f4ad-4a65-b953-17c2024d94c1-kube-api-access-kvhzv\") pod \"lws-controller-manager-fb974466f-4bczq\" (UID: 
\"831e2eae-f4ad-4a65-b953-17c2024d94c1\") " pod="openshift-lws-operator/lws-controller-manager-fb974466f-4bczq" Apr 21 15:44:26.272182 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:26.272144 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/831e2eae-f4ad-4a65-b953-17c2024d94c1-cert\") pod \"lws-controller-manager-fb974466f-4bczq\" (UID: \"831e2eae-f4ad-4a65-b953-17c2024d94c1\") " pod="openshift-lws-operator/lws-controller-manager-fb974466f-4bczq" Apr 21 15:44:26.272372 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:26.272200 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvhzv\" (UniqueName: \"kubernetes.io/projected/831e2eae-f4ad-4a65-b953-17c2024d94c1-kube-api-access-kvhzv\") pod \"lws-controller-manager-fb974466f-4bczq\" (UID: \"831e2eae-f4ad-4a65-b953-17c2024d94c1\") " pod="openshift-lws-operator/lws-controller-manager-fb974466f-4bczq" Apr 21 15:44:26.272372 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:26.272323 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/831e2eae-f4ad-4a65-b953-17c2024d94c1-manager-config\") pod \"lws-controller-manager-fb974466f-4bczq\" (UID: \"831e2eae-f4ad-4a65-b953-17c2024d94c1\") " pod="openshift-lws-operator/lws-controller-manager-fb974466f-4bczq" Apr 21 15:44:26.272372 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:26.272357 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/831e2eae-f4ad-4a65-b953-17c2024d94c1-metrics-cert\") pod \"lws-controller-manager-fb974466f-4bczq\" (UID: \"831e2eae-f4ad-4a65-b953-17c2024d94c1\") " pod="openshift-lws-operator/lws-controller-manager-fb974466f-4bczq" Apr 21 15:44:26.272956 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:26.272932 2569 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/831e2eae-f4ad-4a65-b953-17c2024d94c1-manager-config\") pod \"lws-controller-manager-fb974466f-4bczq\" (UID: \"831e2eae-f4ad-4a65-b953-17c2024d94c1\") " pod="openshift-lws-operator/lws-controller-manager-fb974466f-4bczq" Apr 21 15:44:26.274676 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:26.274660 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/831e2eae-f4ad-4a65-b953-17c2024d94c1-cert\") pod \"lws-controller-manager-fb974466f-4bczq\" (UID: \"831e2eae-f4ad-4a65-b953-17c2024d94c1\") " pod="openshift-lws-operator/lws-controller-manager-fb974466f-4bczq" Apr 21 15:44:26.274821 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:26.274801 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/831e2eae-f4ad-4a65-b953-17c2024d94c1-metrics-cert\") pod \"lws-controller-manager-fb974466f-4bczq\" (UID: \"831e2eae-f4ad-4a65-b953-17c2024d94c1\") " pod="openshift-lws-operator/lws-controller-manager-fb974466f-4bczq" Apr 21 15:44:26.286373 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:26.286343 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvhzv\" (UniqueName: \"kubernetes.io/projected/831e2eae-f4ad-4a65-b953-17c2024d94c1-kube-api-access-kvhzv\") pod \"lws-controller-manager-fb974466f-4bczq\" (UID: \"831e2eae-f4ad-4a65-b953-17c2024d94c1\") " pod="openshift-lws-operator/lws-controller-manager-fb974466f-4bczq" Apr 21 15:44:26.381384 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:26.381301 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-fb974466f-4bczq" Apr 21 15:44:26.513275 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:26.513167 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-fb974466f-4bczq"] Apr 21 15:44:26.515805 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:44:26.515764 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod831e2eae_f4ad_4a65_b953_17c2024d94c1.slice/crio-92c9c94117a58125bb9afcbd46b3606d772758ca8bc12770165de7f635b9d675 WatchSource:0}: Error finding container 92c9c94117a58125bb9afcbd46b3606d772758ca8bc12770165de7f635b9d675: Status 404 returned error can't find the container with id 92c9c94117a58125bb9afcbd46b3606d772758ca8bc12770165de7f635b9d675 Apr 21 15:44:26.850168 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:26.850134 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-fb974466f-4bczq" event={"ID":"831e2eae-f4ad-4a65-b953-17c2024d94c1","Type":"ContainerStarted","Data":"92c9c94117a58125bb9afcbd46b3606d772758ca8bc12770165de7f635b9d675"} Apr 21 15:44:28.858094 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:28.858060 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-fb974466f-4bczq" event={"ID":"831e2eae-f4ad-4a65-b953-17c2024d94c1","Type":"ContainerStarted","Data":"cd71eeeec90a8bba2aef3386563bdf24a1e7a0a125f3619aa5ee31c2769c8f89"} Apr 21 15:44:28.858538 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:28.858199 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-fb974466f-4bczq" Apr 21 15:44:28.878459 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:28.878409 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-fb974466f-4bczq" podStartSLOduration=1.2899532 podStartE2EDuration="2.878393059s" podCreationTimestamp="2026-04-21 15:44:26 +0000 UTC" firstStartedPulling="2026-04-21 15:44:26.517736727 +0000 UTC m=+542.089769652" lastFinishedPulling="2026-04-21 15:44:28.106176589 +0000 UTC m=+543.678209511" observedRunningTime="2026-04-21 15:44:28.877213968 +0000 UTC m=+544.449246913" watchObservedRunningTime="2026-04-21 15:44:28.878393059 +0000 UTC m=+544.450426001" Apr 21 15:44:30.718030 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:30.717993 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22dxqt"] Apr 21 15:44:30.721629 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:30.721607 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22dxqt" Apr 21 15:44:30.725123 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:30.725101 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-q22xw\"" Apr 21 15:44:30.725232 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:30.725126 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 15:44:30.726158 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:30.726142 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 15:44:30.749995 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:30.749968 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22dxqt"] Apr 21 15:44:30.812662 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:30.812617 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9swrr\" (UniqueName: \"kubernetes.io/projected/7495da17-b5bf-41d4-8bbe-ba084f702c4c-kube-api-access-9swrr\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22dxqt\" (UID: \"7495da17-b5bf-41d4-8bbe-ba084f702c4c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22dxqt" Apr 21 15:44:30.812869 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:30.812675 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7495da17-b5bf-41d4-8bbe-ba084f702c4c-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22dxqt\" (UID: \"7495da17-b5bf-41d4-8bbe-ba084f702c4c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22dxqt" Apr 21 15:44:30.812869 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:30.812787 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7495da17-b5bf-41d4-8bbe-ba084f702c4c-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22dxqt\" (UID: \"7495da17-b5bf-41d4-8bbe-ba084f702c4c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22dxqt" Apr 21 15:44:30.913222 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:30.913182 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9swrr\" (UniqueName: \"kubernetes.io/projected/7495da17-b5bf-41d4-8bbe-ba084f702c4c-kube-api-access-9swrr\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22dxqt\" (UID: \"7495da17-b5bf-41d4-8bbe-ba084f702c4c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22dxqt" Apr 21 15:44:30.913222 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:30.913229 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7495da17-b5bf-41d4-8bbe-ba084f702c4c-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22dxqt\" (UID: \"7495da17-b5bf-41d4-8bbe-ba084f702c4c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22dxqt" Apr 21 15:44:30.913474 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:30.913416 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7495da17-b5bf-41d4-8bbe-ba084f702c4c-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22dxqt\" (UID: \"7495da17-b5bf-41d4-8bbe-ba084f702c4c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22dxqt" Apr 21 15:44:30.913686 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:30.913670 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7495da17-b5bf-41d4-8bbe-ba084f702c4c-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22dxqt\" (UID: \"7495da17-b5bf-41d4-8bbe-ba084f702c4c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22dxqt" Apr 21 15:44:30.913732 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:30.913716 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7495da17-b5bf-41d4-8bbe-ba084f702c4c-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22dxqt\" (UID: \"7495da17-b5bf-41d4-8bbe-ba084f702c4c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22dxqt" Apr 21 15:44:30.927863 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:30.927835 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9swrr\" (UniqueName: 
\"kubernetes.io/projected/7495da17-b5bf-41d4-8bbe-ba084f702c4c-kube-api-access-9swrr\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22dxqt\" (UID: \"7495da17-b5bf-41d4-8bbe-ba084f702c4c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22dxqt" Apr 21 15:44:31.029967 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:31.029879 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22dxqt" Apr 21 15:44:31.166213 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:31.166186 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22dxqt"] Apr 21 15:44:31.168404 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:44:31.168377 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7495da17_b5bf_41d4_8bbe_ba084f702c4c.slice/crio-4fc92b4d208b8c076628deefa303f72dc931e4514ba383eab8bee1cc2ade4460 WatchSource:0}: Error finding container 4fc92b4d208b8c076628deefa303f72dc931e4514ba383eab8bee1cc2ade4460: Status 404 returned error can't find the container with id 4fc92b4d208b8c076628deefa303f72dc931e4514ba383eab8bee1cc2ade4460 Apr 21 15:44:31.870240 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:31.870205 2569 generic.go:358] "Generic (PLEG): container finished" podID="7495da17-b5bf-41d4-8bbe-ba084f702c4c" containerID="b847446b2a8e5fcfe394082a0edc06a6ae12560237469a0376257c052aea8d62" exitCode=0 Apr 21 15:44:31.870667 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:31.870271 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22dxqt" event={"ID":"7495da17-b5bf-41d4-8bbe-ba084f702c4c","Type":"ContainerDied","Data":"b847446b2a8e5fcfe394082a0edc06a6ae12560237469a0376257c052aea8d62"} Apr 21 
15:44:31.870667 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:31.870298 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22dxqt" event={"ID":"7495da17-b5bf-41d4-8bbe-ba084f702c4c","Type":"ContainerStarted","Data":"4fc92b4d208b8c076628deefa303f72dc931e4514ba383eab8bee1cc2ade4460"} Apr 21 15:44:32.875622 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:32.875538 2569 generic.go:358] "Generic (PLEG): container finished" podID="7495da17-b5bf-41d4-8bbe-ba084f702c4c" containerID="912f07478784ed797f4584def2c2d245ebae45777d5df06208f78d48ea377078" exitCode=0 Apr 21 15:44:32.875622 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:32.875605 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22dxqt" event={"ID":"7495da17-b5bf-41d4-8bbe-ba084f702c4c","Type":"ContainerDied","Data":"912f07478784ed797f4584def2c2d245ebae45777d5df06208f78d48ea377078"} Apr 21 15:44:33.880515 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:33.880463 2569 generic.go:358] "Generic (PLEG): container finished" podID="7495da17-b5bf-41d4-8bbe-ba084f702c4c" containerID="79dfd2efbaab0437f43b4d3efa8c699e5f710eea4ea2f6fbd5f25dc6a20d7de0" exitCode=0 Apr 21 15:44:33.880915 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:33.880553 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22dxqt" event={"ID":"7495da17-b5bf-41d4-8bbe-ba084f702c4c","Type":"ContainerDied","Data":"79dfd2efbaab0437f43b4d3efa8c699e5f710eea4ea2f6fbd5f25dc6a20d7de0"} Apr 21 15:44:35.006679 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:35.006656 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22dxqt" Apr 21 15:44:35.149145 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:35.149048 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9swrr\" (UniqueName: \"kubernetes.io/projected/7495da17-b5bf-41d4-8bbe-ba084f702c4c-kube-api-access-9swrr\") pod \"7495da17-b5bf-41d4-8bbe-ba084f702c4c\" (UID: \"7495da17-b5bf-41d4-8bbe-ba084f702c4c\") " Apr 21 15:44:35.149145 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:35.149119 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7495da17-b5bf-41d4-8bbe-ba084f702c4c-bundle\") pod \"7495da17-b5bf-41d4-8bbe-ba084f702c4c\" (UID: \"7495da17-b5bf-41d4-8bbe-ba084f702c4c\") " Apr 21 15:44:35.149326 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:35.149172 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7495da17-b5bf-41d4-8bbe-ba084f702c4c-util\") pod \"7495da17-b5bf-41d4-8bbe-ba084f702c4c\" (UID: \"7495da17-b5bf-41d4-8bbe-ba084f702c4c\") " Apr 21 15:44:35.150166 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:35.150129 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7495da17-b5bf-41d4-8bbe-ba084f702c4c-bundle" (OuterVolumeSpecName: "bundle") pod "7495da17-b5bf-41d4-8bbe-ba084f702c4c" (UID: "7495da17-b5bf-41d4-8bbe-ba084f702c4c"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:44:35.151365 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:35.151335 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7495da17-b5bf-41d4-8bbe-ba084f702c4c-kube-api-access-9swrr" (OuterVolumeSpecName: "kube-api-access-9swrr") pod "7495da17-b5bf-41d4-8bbe-ba084f702c4c" (UID: "7495da17-b5bf-41d4-8bbe-ba084f702c4c"). InnerVolumeSpecName "kube-api-access-9swrr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:44:35.154591 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:35.154571 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7495da17-b5bf-41d4-8bbe-ba084f702c4c-util" (OuterVolumeSpecName: "util") pod "7495da17-b5bf-41d4-8bbe-ba084f702c4c" (UID: "7495da17-b5bf-41d4-8bbe-ba084f702c4c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:44:35.249900 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:35.249865 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9swrr\" (UniqueName: \"kubernetes.io/projected/7495da17-b5bf-41d4-8bbe-ba084f702c4c-kube-api-access-9swrr\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:44:35.249900 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:35.249899 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7495da17-b5bf-41d4-8bbe-ba084f702c4c-bundle\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:44:35.249900 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:35.249909 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7495da17-b5bf-41d4-8bbe-ba084f702c4c-util\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:44:35.888926 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:35.888888 2569 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22dxqt" event={"ID":"7495da17-b5bf-41d4-8bbe-ba084f702c4c","Type":"ContainerDied","Data":"4fc92b4d208b8c076628deefa303f72dc931e4514ba383eab8bee1cc2ade4460"} Apr 21 15:44:35.888926 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:35.888924 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fc92b4d208b8c076628deefa303f72dc931e4514ba383eab8bee1cc2ade4460" Apr 21 15:44:35.889124 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:35.888949 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22dxqt" Apr 21 15:44:36.450039 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:36.450006 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-fb57bc5dc-rdphb"] Apr 21 15:44:36.450416 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:36.450326 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7495da17-b5bf-41d4-8bbe-ba084f702c4c" containerName="util" Apr 21 15:44:36.450416 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:36.450339 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7495da17-b5bf-41d4-8bbe-ba084f702c4c" containerName="util" Apr 21 15:44:36.450416 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:36.450350 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7495da17-b5bf-41d4-8bbe-ba084f702c4c" containerName="pull" Apr 21 15:44:36.450416 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:36.450356 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7495da17-b5bf-41d4-8bbe-ba084f702c4c" containerName="pull" Apr 21 15:44:36.450416 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:36.450363 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7495da17-b5bf-41d4-8bbe-ba084f702c4c" 
containerName="extract" Apr 21 15:44:36.450416 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:36.450368 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7495da17-b5bf-41d4-8bbe-ba084f702c4c" containerName="extract" Apr 21 15:44:36.450416 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:36.450419 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="7495da17-b5bf-41d4-8bbe-ba084f702c4c" containerName="extract" Apr 21 15:44:36.455121 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:36.455098 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-fb57bc5dc-rdphb" Apr 21 15:44:36.464916 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:36.464892 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-fb57bc5dc-rdphb"] Apr 21 15:44:36.561826 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:36.561787 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ad438104-e8e8-4a3d-b4b1-086c7fd193a6-service-ca\") pod \"console-fb57bc5dc-rdphb\" (UID: \"ad438104-e8e8-4a3d-b4b1-086c7fd193a6\") " pod="openshift-console/console-fb57bc5dc-rdphb" Apr 21 15:44:36.561826 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:36.561825 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ad438104-e8e8-4a3d-b4b1-086c7fd193a6-oauth-serving-cert\") pod \"console-fb57bc5dc-rdphb\" (UID: \"ad438104-e8e8-4a3d-b4b1-086c7fd193a6\") " pod="openshift-console/console-fb57bc5dc-rdphb" Apr 21 15:44:36.562026 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:36.561847 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ad438104-e8e8-4a3d-b4b1-086c7fd193a6-console-config\") pod 
\"console-fb57bc5dc-rdphb\" (UID: \"ad438104-e8e8-4a3d-b4b1-086c7fd193a6\") " pod="openshift-console/console-fb57bc5dc-rdphb" Apr 21 15:44:36.562026 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:36.561952 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad438104-e8e8-4a3d-b4b1-086c7fd193a6-trusted-ca-bundle\") pod \"console-fb57bc5dc-rdphb\" (UID: \"ad438104-e8e8-4a3d-b4b1-086c7fd193a6\") " pod="openshift-console/console-fb57bc5dc-rdphb" Apr 21 15:44:36.562026 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:36.561989 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ad438104-e8e8-4a3d-b4b1-086c7fd193a6-console-oauth-config\") pod \"console-fb57bc5dc-rdphb\" (UID: \"ad438104-e8e8-4a3d-b4b1-086c7fd193a6\") " pod="openshift-console/console-fb57bc5dc-rdphb" Apr 21 15:44:36.562026 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:36.562015 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ad438104-e8e8-4a3d-b4b1-086c7fd193a6-console-serving-cert\") pod \"console-fb57bc5dc-rdphb\" (UID: \"ad438104-e8e8-4a3d-b4b1-086c7fd193a6\") " pod="openshift-console/console-fb57bc5dc-rdphb" Apr 21 15:44:36.562153 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:36.562038 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td5sd\" (UniqueName: \"kubernetes.io/projected/ad438104-e8e8-4a3d-b4b1-086c7fd193a6-kube-api-access-td5sd\") pod \"console-fb57bc5dc-rdphb\" (UID: \"ad438104-e8e8-4a3d-b4b1-086c7fd193a6\") " pod="openshift-console/console-fb57bc5dc-rdphb" Apr 21 15:44:36.663417 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:36.663382 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad438104-e8e8-4a3d-b4b1-086c7fd193a6-trusted-ca-bundle\") pod \"console-fb57bc5dc-rdphb\" (UID: \"ad438104-e8e8-4a3d-b4b1-086c7fd193a6\") " pod="openshift-console/console-fb57bc5dc-rdphb" Apr 21 15:44:36.663417 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:36.663421 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ad438104-e8e8-4a3d-b4b1-086c7fd193a6-console-oauth-config\") pod \"console-fb57bc5dc-rdphb\" (UID: \"ad438104-e8e8-4a3d-b4b1-086c7fd193a6\") " pod="openshift-console/console-fb57bc5dc-rdphb" Apr 21 15:44:36.663634 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:36.663506 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ad438104-e8e8-4a3d-b4b1-086c7fd193a6-console-serving-cert\") pod \"console-fb57bc5dc-rdphb\" (UID: \"ad438104-e8e8-4a3d-b4b1-086c7fd193a6\") " pod="openshift-console/console-fb57bc5dc-rdphb" Apr 21 15:44:36.663634 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:36.663540 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-td5sd\" (UniqueName: \"kubernetes.io/projected/ad438104-e8e8-4a3d-b4b1-086c7fd193a6-kube-api-access-td5sd\") pod \"console-fb57bc5dc-rdphb\" (UID: \"ad438104-e8e8-4a3d-b4b1-086c7fd193a6\") " pod="openshift-console/console-fb57bc5dc-rdphb" Apr 21 15:44:36.663634 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:36.663578 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ad438104-e8e8-4a3d-b4b1-086c7fd193a6-service-ca\") pod \"console-fb57bc5dc-rdphb\" (UID: \"ad438104-e8e8-4a3d-b4b1-086c7fd193a6\") " pod="openshift-console/console-fb57bc5dc-rdphb" Apr 21 15:44:36.663634 ip-10-0-128-232 
kubenswrapper[2569]: I0421 15:44:36.663594 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ad438104-e8e8-4a3d-b4b1-086c7fd193a6-oauth-serving-cert\") pod \"console-fb57bc5dc-rdphb\" (UID: \"ad438104-e8e8-4a3d-b4b1-086c7fd193a6\") " pod="openshift-console/console-fb57bc5dc-rdphb" Apr 21 15:44:36.663634 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:36.663620 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ad438104-e8e8-4a3d-b4b1-086c7fd193a6-console-config\") pod \"console-fb57bc5dc-rdphb\" (UID: \"ad438104-e8e8-4a3d-b4b1-086c7fd193a6\") " pod="openshift-console/console-fb57bc5dc-rdphb" Apr 21 15:44:36.664321 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:36.664298 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ad438104-e8e8-4a3d-b4b1-086c7fd193a6-console-config\") pod \"console-fb57bc5dc-rdphb\" (UID: \"ad438104-e8e8-4a3d-b4b1-086c7fd193a6\") " pod="openshift-console/console-fb57bc5dc-rdphb" Apr 21 15:44:36.664429 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:36.664351 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ad438104-e8e8-4a3d-b4b1-086c7fd193a6-service-ca\") pod \"console-fb57bc5dc-rdphb\" (UID: \"ad438104-e8e8-4a3d-b4b1-086c7fd193a6\") " pod="openshift-console/console-fb57bc5dc-rdphb" Apr 21 15:44:36.664429 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:36.664408 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad438104-e8e8-4a3d-b4b1-086c7fd193a6-trusted-ca-bundle\") pod \"console-fb57bc5dc-rdphb\" (UID: \"ad438104-e8e8-4a3d-b4b1-086c7fd193a6\") " pod="openshift-console/console-fb57bc5dc-rdphb" Apr 21 15:44:36.664528 
ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:36.664429 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ad438104-e8e8-4a3d-b4b1-086c7fd193a6-oauth-serving-cert\") pod \"console-fb57bc5dc-rdphb\" (UID: \"ad438104-e8e8-4a3d-b4b1-086c7fd193a6\") " pod="openshift-console/console-fb57bc5dc-rdphb" Apr 21 15:44:36.666067 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:36.666037 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ad438104-e8e8-4a3d-b4b1-086c7fd193a6-console-serving-cert\") pod \"console-fb57bc5dc-rdphb\" (UID: \"ad438104-e8e8-4a3d-b4b1-086c7fd193a6\") " pod="openshift-console/console-fb57bc5dc-rdphb" Apr 21 15:44:36.666164 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:36.666093 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ad438104-e8e8-4a3d-b4b1-086c7fd193a6-console-oauth-config\") pod \"console-fb57bc5dc-rdphb\" (UID: \"ad438104-e8e8-4a3d-b4b1-086c7fd193a6\") " pod="openshift-console/console-fb57bc5dc-rdphb" Apr 21 15:44:36.671542 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:36.671521 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-td5sd\" (UniqueName: \"kubernetes.io/projected/ad438104-e8e8-4a3d-b4b1-086c7fd193a6-kube-api-access-td5sd\") pod \"console-fb57bc5dc-rdphb\" (UID: \"ad438104-e8e8-4a3d-b4b1-086c7fd193a6\") " pod="openshift-console/console-fb57bc5dc-rdphb" Apr 21 15:44:36.764828 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:36.764723 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-fb57bc5dc-rdphb" Apr 21 15:44:36.893236 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:36.893211 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-fb57bc5dc-rdphb"] Apr 21 15:44:36.894921 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:44:36.894889 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad438104_e8e8_4a3d_b4b1_086c7fd193a6.slice/crio-991f66a8c706db792148a60085239fc3a139f768a9a8b61292f8b9e59592160b WatchSource:0}: Error finding container 991f66a8c706db792148a60085239fc3a139f768a9a8b61292f8b9e59592160b: Status 404 returned error can't find the container with id 991f66a8c706db792148a60085239fc3a139f768a9a8b61292f8b9e59592160b Apr 21 15:44:37.896768 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:37.896735 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-fb57bc5dc-rdphb" event={"ID":"ad438104-e8e8-4a3d-b4b1-086c7fd193a6","Type":"ContainerStarted","Data":"c7927ee9377986d3855d4e3518a4b0744403a1b3c547d4d430159bb869239fb5"} Apr 21 15:44:37.896768 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:37.896770 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-fb57bc5dc-rdphb" event={"ID":"ad438104-e8e8-4a3d-b4b1-086c7fd193a6","Type":"ContainerStarted","Data":"991f66a8c706db792148a60085239fc3a139f768a9a8b61292f8b9e59592160b"} Apr 21 15:44:37.919225 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:37.919172 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-fb57bc5dc-rdphb" podStartSLOduration=1.919157259 podStartE2EDuration="1.919157259s" podCreationTimestamp="2026-04-21 15:44:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:44:37.916390377 +0000 UTC m=+553.488423330" 
watchObservedRunningTime="2026-04-21 15:44:37.919157259 +0000 UTC m=+553.491190203" Apr 21 15:44:39.863963 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:39.863928 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-fb974466f-4bczq" Apr 21 15:44:46.765881 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:46.765836 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-fb57bc5dc-rdphb" Apr 21 15:44:46.766268 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:46.765895 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-fb57bc5dc-rdphb" Apr 21 15:44:46.770621 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:46.770594 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-fb57bc5dc-rdphb" Apr 21 15:44:46.931410 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:46.931385 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-fb57bc5dc-rdphb" Apr 21 15:44:47.038053 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:44:47.037982 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8575f75ff6-hjp2t"] Apr 21 15:45:00.778380 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:00.778342 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bmsjd6"] Apr 21 15:45:00.783262 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:00.783242 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bmsjd6" Apr 21 15:45:00.787031 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:00.787005 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 15:45:00.788071 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:00.788051 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-q22xw\"" Apr 21 15:45:00.788071 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:00.788057 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 15:45:00.798008 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:00.797977 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bmsjd6"] Apr 21 15:45:00.864231 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:00.864199 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503xv9w9"] Apr 21 15:45:00.867681 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:00.867664 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503xv9w9" Apr 21 15:45:00.877496 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:00.877456 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503xv9w9"] Apr 21 15:45:00.965659 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:00.965623 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fr9r9"] Apr 21 15:45:00.969042 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:00.969024 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fr9r9" Apr 21 15:45:00.975145 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:00.975105 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/42c50b7b-6e3a-4096-a202-29b924d01c73-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bmsjd6\" (UID: \"42c50b7b-6e3a-4096-a202-29b924d01c73\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bmsjd6" Apr 21 15:45:00.975306 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:00.975151 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v6q7\" (UniqueName: \"kubernetes.io/projected/42c50b7b-6e3a-4096-a202-29b924d01c73-kube-api-access-8v6q7\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bmsjd6\" (UID: \"42c50b7b-6e3a-4096-a202-29b924d01c73\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bmsjd6" Apr 21 15:45:00.975306 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:00.975234 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d58ab93d-efd6-42a4-b678-f69ab4fec878-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503xv9w9\" (UID: \"d58ab93d-efd6-42a4-b678-f69ab4fec878\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503xv9w9" Apr 21 15:45:00.975566 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:00.975363 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf2m7\" (UniqueName: \"kubernetes.io/projected/d58ab93d-efd6-42a4-b678-f69ab4fec878-kube-api-access-vf2m7\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503xv9w9\" (UID: \"d58ab93d-efd6-42a4-b678-f69ab4fec878\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503xv9w9" Apr 21 15:45:00.975566 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:00.975436 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42c50b7b-6e3a-4096-a202-29b924d01c73-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bmsjd6\" (UID: \"42c50b7b-6e3a-4096-a202-29b924d01c73\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bmsjd6" Apr 21 15:45:00.976157 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:00.976128 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d58ab93d-efd6-42a4-b678-f69ab4fec878-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503xv9w9\" (UID: \"d58ab93d-efd6-42a4-b678-f69ab4fec878\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503xv9w9" Apr 21 15:45:00.978771 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:00.978751 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fr9r9"] Apr 21 15:45:01.069925 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.069840 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30cf8t9"] Apr 21 15:45:01.073173 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.073151 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30cf8t9" Apr 21 15:45:01.076868 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.076846 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d58ab93d-efd6-42a4-b678-f69ab4fec878-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503xv9w9\" (UID: \"d58ab93d-efd6-42a4-b678-f69ab4fec878\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503xv9w9" Apr 21 15:45:01.076972 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.076881 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h9h7\" (UniqueName: \"kubernetes.io/projected/6a176f36-fb8d-4ab8-9b0c-35bf2be53527-kube-api-access-4h9h7\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fr9r9\" (UID: \"6a176f36-fb8d-4ab8-9b0c-35bf2be53527\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fr9r9" Apr 21 15:45:01.076972 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.076906 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vf2m7\" (UniqueName: \"kubernetes.io/projected/d58ab93d-efd6-42a4-b678-f69ab4fec878-kube-api-access-vf2m7\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503xv9w9\" (UID: \"d58ab93d-efd6-42a4-b678-f69ab4fec878\") " 
pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503xv9w9" Apr 21 15:45:01.076972 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.076925 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a176f36-fb8d-4ab8-9b0c-35bf2be53527-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fr9r9\" (UID: \"6a176f36-fb8d-4ab8-9b0c-35bf2be53527\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fr9r9" Apr 21 15:45:01.077128 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.077042 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42c50b7b-6e3a-4096-a202-29b924d01c73-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bmsjd6\" (UID: \"42c50b7b-6e3a-4096-a202-29b924d01c73\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bmsjd6" Apr 21 15:45:01.077128 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.077120 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a176f36-fb8d-4ab8-9b0c-35bf2be53527-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fr9r9\" (UID: \"6a176f36-fb8d-4ab8-9b0c-35bf2be53527\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fr9r9" Apr 21 15:45:01.077239 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.077162 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d58ab93d-efd6-42a4-b678-f69ab4fec878-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503xv9w9\" (UID: \"d58ab93d-efd6-42a4-b678-f69ab4fec878\") " 
pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503xv9w9" Apr 21 15:45:01.077297 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.077251 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/42c50b7b-6e3a-4096-a202-29b924d01c73-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bmsjd6\" (UID: \"42c50b7b-6e3a-4096-a202-29b924d01c73\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bmsjd6" Apr 21 15:45:01.077297 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.077258 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d58ab93d-efd6-42a4-b678-f69ab4fec878-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503xv9w9\" (UID: \"d58ab93d-efd6-42a4-b678-f69ab4fec878\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503xv9w9" Apr 21 15:45:01.077297 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.077284 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8v6q7\" (UniqueName: \"kubernetes.io/projected/42c50b7b-6e3a-4096-a202-29b924d01c73-kube-api-access-8v6q7\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bmsjd6\" (UID: \"42c50b7b-6e3a-4096-a202-29b924d01c73\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bmsjd6" Apr 21 15:45:01.077451 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.077344 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42c50b7b-6e3a-4096-a202-29b924d01c73-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bmsjd6\" (UID: \"42c50b7b-6e3a-4096-a202-29b924d01c73\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bmsjd6" Apr 
21 15:45:01.077525 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.077456 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/42c50b7b-6e3a-4096-a202-29b924d01c73-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bmsjd6\" (UID: \"42c50b7b-6e3a-4096-a202-29b924d01c73\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bmsjd6" Apr 21 15:45:01.077525 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.077466 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d58ab93d-efd6-42a4-b678-f69ab4fec878-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503xv9w9\" (UID: \"d58ab93d-efd6-42a4-b678-f69ab4fec878\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503xv9w9" Apr 21 15:45:01.082635 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.082611 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30cf8t9"] Apr 21 15:45:01.086786 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.086767 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf2m7\" (UniqueName: \"kubernetes.io/projected/d58ab93d-efd6-42a4-b678-f69ab4fec878-kube-api-access-vf2m7\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503xv9w9\" (UID: \"d58ab93d-efd6-42a4-b678-f69ab4fec878\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503xv9w9" Apr 21 15:45:01.087342 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.087324 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v6q7\" (UniqueName: \"kubernetes.io/projected/42c50b7b-6e3a-4096-a202-29b924d01c73-kube-api-access-8v6q7\") pod 
\"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bmsjd6\" (UID: \"42c50b7b-6e3a-4096-a202-29b924d01c73\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bmsjd6" Apr 21 15:45:01.093188 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.093166 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bmsjd6" Apr 21 15:45:01.176747 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.176654 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503xv9w9" Apr 21 15:45:01.177835 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.177807 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4h9h7\" (UniqueName: \"kubernetes.io/projected/6a176f36-fb8d-4ab8-9b0c-35bf2be53527-kube-api-access-4h9h7\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fr9r9\" (UID: \"6a176f36-fb8d-4ab8-9b0c-35bf2be53527\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fr9r9" Apr 21 15:45:01.177926 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.177859 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a176f36-fb8d-4ab8-9b0c-35bf2be53527-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fr9r9\" (UID: \"6a176f36-fb8d-4ab8-9b0c-35bf2be53527\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fr9r9" Apr 21 15:45:01.177926 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.177905 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43bb6262-1d2f-4dfb-8a8b-237b4d4a2940-bundle\") pod 
\"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30cf8t9\" (UID: \"43bb6262-1d2f-4dfb-8a8b-237b4d4a2940\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30cf8t9" Apr 21 15:45:01.178041 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.177950 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a176f36-fb8d-4ab8-9b0c-35bf2be53527-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fr9r9\" (UID: \"6a176f36-fb8d-4ab8-9b0c-35bf2be53527\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fr9r9" Apr 21 15:45:01.178041 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.177995 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktgck\" (UniqueName: \"kubernetes.io/projected/43bb6262-1d2f-4dfb-8a8b-237b4d4a2940-kube-api-access-ktgck\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30cf8t9\" (UID: \"43bb6262-1d2f-4dfb-8a8b-237b4d4a2940\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30cf8t9" Apr 21 15:45:01.178153 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.178055 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43bb6262-1d2f-4dfb-8a8b-237b4d4a2940-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30cf8t9\" (UID: \"43bb6262-1d2f-4dfb-8a8b-237b4d4a2940\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30cf8t9" Apr 21 15:45:01.178316 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.178287 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a176f36-fb8d-4ab8-9b0c-35bf2be53527-bundle\") pod 
\"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fr9r9\" (UID: \"6a176f36-fb8d-4ab8-9b0c-35bf2be53527\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fr9r9" Apr 21 15:45:01.178562 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.178525 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a176f36-fb8d-4ab8-9b0c-35bf2be53527-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fr9r9\" (UID: \"6a176f36-fb8d-4ab8-9b0c-35bf2be53527\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fr9r9" Apr 21 15:45:01.189387 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.189358 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h9h7\" (UniqueName: \"kubernetes.io/projected/6a176f36-fb8d-4ab8-9b0c-35bf2be53527-kube-api-access-4h9h7\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fr9r9\" (UID: \"6a176f36-fb8d-4ab8-9b0c-35bf2be53527\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fr9r9" Apr 21 15:45:01.221454 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.221421 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bmsjd6"] Apr 21 15:45:01.225549 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:45:01.225513 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42c50b7b_6e3a_4096_a202_29b924d01c73.slice/crio-3932fd6118aa0d6dfb80d6f0f75c17b77482786e360bdbb6c48e5303ffb1a784 WatchSource:0}: Error finding container 3932fd6118aa0d6dfb80d6f0f75c17b77482786e360bdbb6c48e5303ffb1a784: Status 404 returned error can't find the container with id 3932fd6118aa0d6dfb80d6f0f75c17b77482786e360bdbb6c48e5303ffb1a784 Apr 21 15:45:01.279025 ip-10-0-128-232 
kubenswrapper[2569]: I0421 15:45:01.278991 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fr9r9" Apr 21 15:45:01.279340 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.279315 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43bb6262-1d2f-4dfb-8a8b-237b4d4a2940-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30cf8t9\" (UID: \"43bb6262-1d2f-4dfb-8a8b-237b4d4a2940\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30cf8t9" Apr 21 15:45:01.279426 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.279390 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ktgck\" (UniqueName: \"kubernetes.io/projected/43bb6262-1d2f-4dfb-8a8b-237b4d4a2940-kube-api-access-ktgck\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30cf8t9\" (UID: \"43bb6262-1d2f-4dfb-8a8b-237b4d4a2940\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30cf8t9" Apr 21 15:45:01.279503 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.279443 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43bb6262-1d2f-4dfb-8a8b-237b4d4a2940-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30cf8t9\" (UID: \"43bb6262-1d2f-4dfb-8a8b-237b4d4a2940\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30cf8t9" Apr 21 15:45:01.279804 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.279772 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43bb6262-1d2f-4dfb-8a8b-237b4d4a2940-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30cf8t9\" (UID: 
\"43bb6262-1d2f-4dfb-8a8b-237b4d4a2940\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30cf8t9" Apr 21 15:45:01.280025 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.279854 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43bb6262-1d2f-4dfb-8a8b-237b4d4a2940-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30cf8t9\" (UID: \"43bb6262-1d2f-4dfb-8a8b-237b4d4a2940\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30cf8t9" Apr 21 15:45:01.291243 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.291205 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktgck\" (UniqueName: \"kubernetes.io/projected/43bb6262-1d2f-4dfb-8a8b-237b4d4a2940-kube-api-access-ktgck\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30cf8t9\" (UID: \"43bb6262-1d2f-4dfb-8a8b-237b4d4a2940\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30cf8t9" Apr 21 15:45:01.316715 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.316678 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503xv9w9"] Apr 21 15:45:01.320167 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:45:01.320085 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd58ab93d_efd6_42a4_b678_f69ab4fec878.slice/crio-998d9fe596ff23e7eaf2f1064ebc3e1ca3d4c9fdb2ed90ff26d88f1fe144448e WatchSource:0}: Error finding container 998d9fe596ff23e7eaf2f1064ebc3e1ca3d4c9fdb2ed90ff26d88f1fe144448e: Status 404 returned error can't find the container with id 998d9fe596ff23e7eaf2f1064ebc3e1ca3d4c9fdb2ed90ff26d88f1fe144448e Apr 21 15:45:01.383128 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.383092 2569 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30cf8t9" Apr 21 15:45:01.425274 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.425183 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fr9r9"] Apr 21 15:45:01.427495 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:45:01.427450 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a176f36_fb8d_4ab8_9b0c_35bf2be53527.slice/crio-44c7ddf36cc154e942061c131d07ef55e5c9162bec8d54aa8b6913ef45794208 WatchSource:0}: Error finding container 44c7ddf36cc154e942061c131d07ef55e5c9162bec8d54aa8b6913ef45794208: Status 404 returned error can't find the container with id 44c7ddf36cc154e942061c131d07ef55e5c9162bec8d54aa8b6913ef45794208 Apr 21 15:45:01.550654 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.550623 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30cf8t9"] Apr 21 15:45:01.588408 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:45:01.588337 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43bb6262_1d2f_4dfb_8a8b_237b4d4a2940.slice/crio-404002e2bc8867ced2e361a685a90e9325786332a5fe64e864a3d3b60ac4b893 WatchSource:0}: Error finding container 404002e2bc8867ced2e361a685a90e9325786332a5fe64e864a3d3b60ac4b893: Status 404 returned error can't find the container with id 404002e2bc8867ced2e361a685a90e9325786332a5fe64e864a3d3b60ac4b893 Apr 21 15:45:01.988574 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.988541 2569 generic.go:358] "Generic (PLEG): container finished" podID="6a176f36-fb8d-4ab8-9b0c-35bf2be53527" containerID="df91eeb94d95bcb70d58affa22f41aa9cf50d79f30f70f77e8af613072e307a4" exitCode=0 Apr 21 
15:45:01.988990 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.988631 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fr9r9" event={"ID":"6a176f36-fb8d-4ab8-9b0c-35bf2be53527","Type":"ContainerDied","Data":"df91eeb94d95bcb70d58affa22f41aa9cf50d79f30f70f77e8af613072e307a4"} Apr 21 15:45:01.988990 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.988673 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fr9r9" event={"ID":"6a176f36-fb8d-4ab8-9b0c-35bf2be53527","Type":"ContainerStarted","Data":"44c7ddf36cc154e942061c131d07ef55e5c9162bec8d54aa8b6913ef45794208"} Apr 21 15:45:01.990049 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.989987 2569 generic.go:358] "Generic (PLEG): container finished" podID="d58ab93d-efd6-42a4-b678-f69ab4fec878" containerID="515e3080bb4bfef410eb0c6a8bb0caa65a726230819a9bdf1a498804ce6d9df2" exitCode=0 Apr 21 15:45:01.990104 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.990064 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503xv9w9" event={"ID":"d58ab93d-efd6-42a4-b678-f69ab4fec878","Type":"ContainerDied","Data":"515e3080bb4bfef410eb0c6a8bb0caa65a726230819a9bdf1a498804ce6d9df2"} Apr 21 15:45:01.990104 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.990089 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503xv9w9" event={"ID":"d58ab93d-efd6-42a4-b678-f69ab4fec878","Type":"ContainerStarted","Data":"998d9fe596ff23e7eaf2f1064ebc3e1ca3d4c9fdb2ed90ff26d88f1fe144448e"} Apr 21 15:45:01.991418 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.991391 2569 generic.go:358] "Generic (PLEG): container finished" podID="43bb6262-1d2f-4dfb-8a8b-237b4d4a2940" 
containerID="f0d7e282a8d2146b0fac9682a28fd3dc165787d4a54a51204828cb76dbb5eccb" exitCode=0 Apr 21 15:45:01.991527 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.991420 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30cf8t9" event={"ID":"43bb6262-1d2f-4dfb-8a8b-237b4d4a2940","Type":"ContainerDied","Data":"f0d7e282a8d2146b0fac9682a28fd3dc165787d4a54a51204828cb76dbb5eccb"} Apr 21 15:45:01.991527 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.991448 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30cf8t9" event={"ID":"43bb6262-1d2f-4dfb-8a8b-237b4d4a2940","Type":"ContainerStarted","Data":"404002e2bc8867ced2e361a685a90e9325786332a5fe64e864a3d3b60ac4b893"} Apr 21 15:45:01.993117 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.993092 2569 generic.go:358] "Generic (PLEG): container finished" podID="42c50b7b-6e3a-4096-a202-29b924d01c73" containerID="bfb9b20ddae1b49384a54e5e579063caf72b8d5e3f92c82c72cb9b7e06359a1e" exitCode=0 Apr 21 15:45:01.993234 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.993115 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bmsjd6" event={"ID":"42c50b7b-6e3a-4096-a202-29b924d01c73","Type":"ContainerDied","Data":"bfb9b20ddae1b49384a54e5e579063caf72b8d5e3f92c82c72cb9b7e06359a1e"} Apr 21 15:45:01.993234 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:01.993142 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bmsjd6" event={"ID":"42c50b7b-6e3a-4096-a202-29b924d01c73","Type":"ContainerStarted","Data":"3932fd6118aa0d6dfb80d6f0f75c17b77482786e360bdbb6c48e5303ffb1a784"} Apr 21 15:45:04.001520 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:04.001453 2569 generic.go:358] "Generic 
(PLEG): container finished" podID="6a176f36-fb8d-4ab8-9b0c-35bf2be53527" containerID="fcaa840d4c6549b0e13638311a8b72f5bf70d520e1d7b0fc86a5879f6d57865b" exitCode=0 Apr 21 15:45:04.001918 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:04.001519 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fr9r9" event={"ID":"6a176f36-fb8d-4ab8-9b0c-35bf2be53527","Type":"ContainerDied","Data":"fcaa840d4c6549b0e13638311a8b72f5bf70d520e1d7b0fc86a5879f6d57865b"} Apr 21 15:45:04.003154 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:04.003022 2569 generic.go:358] "Generic (PLEG): container finished" podID="43bb6262-1d2f-4dfb-8a8b-237b4d4a2940" containerID="1520d397418030abae59fd56fc2d1c8a8b24b449b57cfa9b179b2c1a80fabca6" exitCode=0 Apr 21 15:45:04.003154 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:04.003053 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30cf8t9" event={"ID":"43bb6262-1d2f-4dfb-8a8b-237b4d4a2940","Type":"ContainerDied","Data":"1520d397418030abae59fd56fc2d1c8a8b24b449b57cfa9b179b2c1a80fabca6"} Apr 21 15:45:04.004747 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:04.004729 2569 generic.go:358] "Generic (PLEG): container finished" podID="42c50b7b-6e3a-4096-a202-29b924d01c73" containerID="a790d34fe5d40b60ae726d6fc2c852a273ee42ceb7dcaf36dfdff21eefe0b0d9" exitCode=0 Apr 21 15:45:04.004832 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:04.004811 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bmsjd6" event={"ID":"42c50b7b-6e3a-4096-a202-29b924d01c73","Type":"ContainerDied","Data":"a790d34fe5d40b60ae726d6fc2c852a273ee42ceb7dcaf36dfdff21eefe0b0d9"} Apr 21 15:45:05.014100 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:05.014065 2569 generic.go:358] "Generic (PLEG): container finished" 
podID="6a176f36-fb8d-4ab8-9b0c-35bf2be53527" containerID="6432f26301cc506f2db0af326d756c86deb5ab6519e948c606a71f949de3f3ed" exitCode=0 Apr 21 15:45:05.014537 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:05.014147 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fr9r9" event={"ID":"6a176f36-fb8d-4ab8-9b0c-35bf2be53527","Type":"ContainerDied","Data":"6432f26301cc506f2db0af326d756c86deb5ab6519e948c606a71f949de3f3ed"} Apr 21 15:45:05.015766 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:05.015742 2569 generic.go:358] "Generic (PLEG): container finished" podID="d58ab93d-efd6-42a4-b678-f69ab4fec878" containerID="b59edc2f1d0b8476998c7c6b84b50aef700b4ba35a2662bd58feb8e99f074355" exitCode=0 Apr 21 15:45:05.015904 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:05.015829 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503xv9w9" event={"ID":"d58ab93d-efd6-42a4-b678-f69ab4fec878","Type":"ContainerDied","Data":"b59edc2f1d0b8476998c7c6b84b50aef700b4ba35a2662bd58feb8e99f074355"} Apr 21 15:45:05.017788 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:05.017768 2569 generic.go:358] "Generic (PLEG): container finished" podID="43bb6262-1d2f-4dfb-8a8b-237b4d4a2940" containerID="3a5b4c390b07e6d44929ab1c0b8a2657d8bc7025b397f72cc0b82d765e19aac8" exitCode=0 Apr 21 15:45:05.017876 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:05.017834 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30cf8t9" event={"ID":"43bb6262-1d2f-4dfb-8a8b-237b4d4a2940","Type":"ContainerDied","Data":"3a5b4c390b07e6d44929ab1c0b8a2657d8bc7025b397f72cc0b82d765e19aac8"} Apr 21 15:45:05.019702 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:05.019680 2569 generic.go:358] "Generic (PLEG): container finished" 
podID="42c50b7b-6e3a-4096-a202-29b924d01c73" containerID="5b526babb24c94fdc267e6d8347e964e705afd97291d364b7b948a2ba02eb3b3" exitCode=0 Apr 21 15:45:05.019785 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:05.019748 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bmsjd6" event={"ID":"42c50b7b-6e3a-4096-a202-29b924d01c73","Type":"ContainerDied","Data":"5b526babb24c94fdc267e6d8347e964e705afd97291d364b7b948a2ba02eb3b3"} Apr 21 15:45:06.025655 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:06.025616 2569 generic.go:358] "Generic (PLEG): container finished" podID="d58ab93d-efd6-42a4-b678-f69ab4fec878" containerID="f434941f65bfbd7ab525861cc267caef2607609a4aad6ae52b1d161b278ef643" exitCode=0 Apr 21 15:45:06.026095 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:06.025694 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503xv9w9" event={"ID":"d58ab93d-efd6-42a4-b678-f69ab4fec878","Type":"ContainerDied","Data":"f434941f65bfbd7ab525861cc267caef2607609a4aad6ae52b1d161b278ef643"} Apr 21 15:45:06.196336 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:06.196309 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bmsjd6" Apr 21 15:45:06.199706 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:06.199683 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30cf8t9" Apr 21 15:45:06.205363 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:06.205335 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fr9r9" Apr 21 15:45:06.221897 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:06.221870 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/42c50b7b-6e3a-4096-a202-29b924d01c73-util\") pod \"42c50b7b-6e3a-4096-a202-29b924d01c73\" (UID: \"42c50b7b-6e3a-4096-a202-29b924d01c73\") " Apr 21 15:45:06.222036 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:06.221909 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43bb6262-1d2f-4dfb-8a8b-237b4d4a2940-util\") pod \"43bb6262-1d2f-4dfb-8a8b-237b4d4a2940\" (UID: \"43bb6262-1d2f-4dfb-8a8b-237b4d4a2940\") " Apr 21 15:45:06.222036 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:06.221935 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a176f36-fb8d-4ab8-9b0c-35bf2be53527-util\") pod \"6a176f36-fb8d-4ab8-9b0c-35bf2be53527\" (UID: \"6a176f36-fb8d-4ab8-9b0c-35bf2be53527\") " Apr 21 15:45:06.222036 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:06.221960 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43bb6262-1d2f-4dfb-8a8b-237b4d4a2940-bundle\") pod \"43bb6262-1d2f-4dfb-8a8b-237b4d4a2940\" (UID: \"43bb6262-1d2f-4dfb-8a8b-237b4d4a2940\") " Apr 21 15:45:06.222036 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:06.221989 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a176f36-fb8d-4ab8-9b0c-35bf2be53527-bundle\") pod \"6a176f36-fb8d-4ab8-9b0c-35bf2be53527\" (UID: \"6a176f36-fb8d-4ab8-9b0c-35bf2be53527\") " Apr 21 15:45:06.222036 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:06.222035 2569 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v6q7\" (UniqueName: \"kubernetes.io/projected/42c50b7b-6e3a-4096-a202-29b924d01c73-kube-api-access-8v6q7\") pod \"42c50b7b-6e3a-4096-a202-29b924d01c73\" (UID: \"42c50b7b-6e3a-4096-a202-29b924d01c73\") " Apr 21 15:45:06.222316 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:06.222117 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4h9h7\" (UniqueName: \"kubernetes.io/projected/6a176f36-fb8d-4ab8-9b0c-35bf2be53527-kube-api-access-4h9h7\") pod \"6a176f36-fb8d-4ab8-9b0c-35bf2be53527\" (UID: \"6a176f36-fb8d-4ab8-9b0c-35bf2be53527\") " Apr 21 15:45:06.222316 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:06.222144 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktgck\" (UniqueName: \"kubernetes.io/projected/43bb6262-1d2f-4dfb-8a8b-237b4d4a2940-kube-api-access-ktgck\") pod \"43bb6262-1d2f-4dfb-8a8b-237b4d4a2940\" (UID: \"43bb6262-1d2f-4dfb-8a8b-237b4d4a2940\") " Apr 21 15:45:06.222316 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:06.222177 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42c50b7b-6e3a-4096-a202-29b924d01c73-bundle\") pod \"42c50b7b-6e3a-4096-a202-29b924d01c73\" (UID: \"42c50b7b-6e3a-4096-a202-29b924d01c73\") " Apr 21 15:45:06.223186 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:06.222847 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42c50b7b-6e3a-4096-a202-29b924d01c73-bundle" (OuterVolumeSpecName: "bundle") pod "42c50b7b-6e3a-4096-a202-29b924d01c73" (UID: "42c50b7b-6e3a-4096-a202-29b924d01c73"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:45:06.223186 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:06.223052 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a176f36-fb8d-4ab8-9b0c-35bf2be53527-bundle" (OuterVolumeSpecName: "bundle") pod "6a176f36-fb8d-4ab8-9b0c-35bf2be53527" (UID: "6a176f36-fb8d-4ab8-9b0c-35bf2be53527"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:45:06.223186 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:06.223208 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43bb6262-1d2f-4dfb-8a8b-237b4d4a2940-bundle" (OuterVolumeSpecName: "bundle") pod "43bb6262-1d2f-4dfb-8a8b-237b4d4a2940" (UID: "43bb6262-1d2f-4dfb-8a8b-237b4d4a2940"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:45:06.224924 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:06.224871 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42c50b7b-6e3a-4096-a202-29b924d01c73-kube-api-access-8v6q7" (OuterVolumeSpecName: "kube-api-access-8v6q7") pod "42c50b7b-6e3a-4096-a202-29b924d01c73" (UID: "42c50b7b-6e3a-4096-a202-29b924d01c73"). InnerVolumeSpecName "kube-api-access-8v6q7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:45:06.225381 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:06.225355 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a176f36-fb8d-4ab8-9b0c-35bf2be53527-kube-api-access-4h9h7" (OuterVolumeSpecName: "kube-api-access-4h9h7") pod "6a176f36-fb8d-4ab8-9b0c-35bf2be53527" (UID: "6a176f36-fb8d-4ab8-9b0c-35bf2be53527"). InnerVolumeSpecName "kube-api-access-4h9h7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:45:06.225721 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:06.225692 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43bb6262-1d2f-4dfb-8a8b-237b4d4a2940-kube-api-access-ktgck" (OuterVolumeSpecName: "kube-api-access-ktgck") pod "43bb6262-1d2f-4dfb-8a8b-237b4d4a2940" (UID: "43bb6262-1d2f-4dfb-8a8b-237b4d4a2940"). InnerVolumeSpecName "kube-api-access-ktgck". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:45:06.227895 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:06.227875 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a176f36-fb8d-4ab8-9b0c-35bf2be53527-util" (OuterVolumeSpecName: "util") pod "6a176f36-fb8d-4ab8-9b0c-35bf2be53527" (UID: "6a176f36-fb8d-4ab8-9b0c-35bf2be53527"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:45:06.228234 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:06.228214 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43bb6262-1d2f-4dfb-8a8b-237b4d4a2940-util" (OuterVolumeSpecName: "util") pod "43bb6262-1d2f-4dfb-8a8b-237b4d4a2940" (UID: "43bb6262-1d2f-4dfb-8a8b-237b4d4a2940"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:45:06.228814 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:06.228793 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42c50b7b-6e3a-4096-a202-29b924d01c73-util" (OuterVolumeSpecName: "util") pod "42c50b7b-6e3a-4096-a202-29b924d01c73" (UID: "42c50b7b-6e3a-4096-a202-29b924d01c73"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:45:06.323778 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:06.323690 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ktgck\" (UniqueName: \"kubernetes.io/projected/43bb6262-1d2f-4dfb-8a8b-237b4d4a2940-kube-api-access-ktgck\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:45:06.323778 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:06.323723 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42c50b7b-6e3a-4096-a202-29b924d01c73-bundle\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:45:06.323778 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:06.323733 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/42c50b7b-6e3a-4096-a202-29b924d01c73-util\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:45:06.323778 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:06.323741 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43bb6262-1d2f-4dfb-8a8b-237b4d4a2940-util\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:45:06.323778 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:06.323751 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a176f36-fb8d-4ab8-9b0c-35bf2be53527-util\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:45:06.323778 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:06.323759 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43bb6262-1d2f-4dfb-8a8b-237b4d4a2940-bundle\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:45:06.323778 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:06.323767 2569 reconciler_common.go:299] "Volume detached for volume 
\"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a176f36-fb8d-4ab8-9b0c-35bf2be53527-bundle\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:45:06.323778 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:06.323775 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8v6q7\" (UniqueName: \"kubernetes.io/projected/42c50b7b-6e3a-4096-a202-29b924d01c73-kube-api-access-8v6q7\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:45:06.323778 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:06.323786 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4h9h7\" (UniqueName: \"kubernetes.io/projected/6a176f36-fb8d-4ab8-9b0c-35bf2be53527-kube-api-access-4h9h7\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:45:07.031523 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:07.031472 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30cf8t9" event={"ID":"43bb6262-1d2f-4dfb-8a8b-237b4d4a2940","Type":"ContainerDied","Data":"404002e2bc8867ced2e361a685a90e9325786332a5fe64e864a3d3b60ac4b893"} Apr 21 15:45:07.031523 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:07.031523 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="404002e2bc8867ced2e361a685a90e9325786332a5fe64e864a3d3b60ac4b893" Apr 21 15:45:07.031977 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:07.031520 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30cf8t9" Apr 21 15:45:07.033181 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:07.033150 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bmsjd6" event={"ID":"42c50b7b-6e3a-4096-a202-29b924d01c73","Type":"ContainerDied","Data":"3932fd6118aa0d6dfb80d6f0f75c17b77482786e360bdbb6c48e5303ffb1a784"} Apr 21 15:45:07.033181 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:07.033169 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bmsjd6" Apr 21 15:45:07.033394 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:07.033185 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3932fd6118aa0d6dfb80d6f0f75c17b77482786e360bdbb6c48e5303ffb1a784" Apr 21 15:45:07.035013 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:07.034990 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fr9r9" Apr 21 15:45:07.035013 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:07.035004 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fr9r9" event={"ID":"6a176f36-fb8d-4ab8-9b0c-35bf2be53527","Type":"ContainerDied","Data":"44c7ddf36cc154e942061c131d07ef55e5c9162bec8d54aa8b6913ef45794208"} Apr 21 15:45:07.035190 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:07.035032 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44c7ddf36cc154e942061c131d07ef55e5c9162bec8d54aa8b6913ef45794208" Apr 21 15:45:07.155262 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:07.155238 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503xv9w9" Apr 21 15:45:07.230768 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:07.230714 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d58ab93d-efd6-42a4-b678-f69ab4fec878-util\") pod \"d58ab93d-efd6-42a4-b678-f69ab4fec878\" (UID: \"d58ab93d-efd6-42a4-b678-f69ab4fec878\") " Apr 21 15:45:07.230768 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:07.230782 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d58ab93d-efd6-42a4-b678-f69ab4fec878-bundle\") pod \"d58ab93d-efd6-42a4-b678-f69ab4fec878\" (UID: \"d58ab93d-efd6-42a4-b678-f69ab4fec878\") " Apr 21 15:45:07.231023 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:07.230816 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf2m7\" (UniqueName: \"kubernetes.io/projected/d58ab93d-efd6-42a4-b678-f69ab4fec878-kube-api-access-vf2m7\") pod \"d58ab93d-efd6-42a4-b678-f69ab4fec878\" (UID: \"d58ab93d-efd6-42a4-b678-f69ab4fec878\") " Apr 21 15:45:07.231408 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:07.231384 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d58ab93d-efd6-42a4-b678-f69ab4fec878-bundle" (OuterVolumeSpecName: "bundle") pod "d58ab93d-efd6-42a4-b678-f69ab4fec878" (UID: "d58ab93d-efd6-42a4-b678-f69ab4fec878"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:45:07.232949 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:07.232922 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d58ab93d-efd6-42a4-b678-f69ab4fec878-kube-api-access-vf2m7" (OuterVolumeSpecName: "kube-api-access-vf2m7") pod "d58ab93d-efd6-42a4-b678-f69ab4fec878" (UID: "d58ab93d-efd6-42a4-b678-f69ab4fec878"). InnerVolumeSpecName "kube-api-access-vf2m7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:45:07.235220 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:07.235184 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d58ab93d-efd6-42a4-b678-f69ab4fec878-util" (OuterVolumeSpecName: "util") pod "d58ab93d-efd6-42a4-b678-f69ab4fec878" (UID: "d58ab93d-efd6-42a4-b678-f69ab4fec878"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:45:07.331985 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:07.331891 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d58ab93d-efd6-42a4-b678-f69ab4fec878-util\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:45:07.331985 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:07.331925 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d58ab93d-efd6-42a4-b678-f69ab4fec878-bundle\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:45:07.331985 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:07.331935 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vf2m7\" (UniqueName: \"kubernetes.io/projected/d58ab93d-efd6-42a4-b678-f69ab4fec878-kube-api-access-vf2m7\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:45:08.040015 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:08.039983 2569 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503xv9w9" Apr 21 15:45:08.040015 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:08.039992 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503xv9w9" event={"ID":"d58ab93d-efd6-42a4-b678-f69ab4fec878","Type":"ContainerDied","Data":"998d9fe596ff23e7eaf2f1064ebc3e1ca3d4c9fdb2ed90ff26d88f1fe144448e"} Apr 21 15:45:08.040015 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:08.040021 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="998d9fe596ff23e7eaf2f1064ebc3e1ca3d4c9fdb2ed90ff26d88f1fe144448e" Apr 21 15:45:12.060025 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:12.059982 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-8575f75ff6-hjp2t" podUID="0b53dc26-94b9-4fe8-9ac6-40239830cc3c" containerName="console" containerID="cri-o://a88f7b2fedec20fc2107982e83a28af5e2ef05bc3a560cba3747318cf96ad20f" gracePeriod=15 Apr 21 15:45:12.304158 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:12.304134 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8575f75ff6-hjp2t_0b53dc26-94b9-4fe8-9ac6-40239830cc3c/console/0.log" Apr 21 15:45:12.304267 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:12.304193 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8575f75ff6-hjp2t" Apr 21 15:45:12.370341 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:12.370261 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvt5q\" (UniqueName: \"kubernetes.io/projected/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-kube-api-access-pvt5q\") pod \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\" (UID: \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\") " Apr 21 15:45:12.370341 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:12.370312 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-console-oauth-config\") pod \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\" (UID: \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\") " Apr 21 15:45:12.370603 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:12.370355 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-oauth-serving-cert\") pod \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\" (UID: \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\") " Apr 21 15:45:12.370603 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:12.370376 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-trusted-ca-bundle\") pod \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\" (UID: \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\") " Apr 21 15:45:12.370603 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:12.370403 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-service-ca\") pod \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\" (UID: \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\") " Apr 21 15:45:12.370603 
ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:12.370425 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-console-config\") pod \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\" (UID: \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\") " Apr 21 15:45:12.370603 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:12.370472 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-console-serving-cert\") pod \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\" (UID: \"0b53dc26-94b9-4fe8-9ac6-40239830cc3c\") " Apr 21 15:45:12.370934 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:12.370905 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b53dc26-94b9-4fe8-9ac6-40239830cc3c" (UID: "0b53dc26-94b9-4fe8-9ac6-40239830cc3c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:45:12.371051 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:12.370928 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-console-config" (OuterVolumeSpecName: "console-config") pod "0b53dc26-94b9-4fe8-9ac6-40239830cc3c" (UID: "0b53dc26-94b9-4fe8-9ac6-40239830cc3c"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:45:12.371051 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:12.370906 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0b53dc26-94b9-4fe8-9ac6-40239830cc3c" (UID: "0b53dc26-94b9-4fe8-9ac6-40239830cc3c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:45:12.371051 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:12.371000 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0b53dc26-94b9-4fe8-9ac6-40239830cc3c" (UID: "0b53dc26-94b9-4fe8-9ac6-40239830cc3c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:45:12.372554 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:12.372530 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0b53dc26-94b9-4fe8-9ac6-40239830cc3c" (UID: "0b53dc26-94b9-4fe8-9ac6-40239830cc3c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:45:12.372653 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:12.372571 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-kube-api-access-pvt5q" (OuterVolumeSpecName: "kube-api-access-pvt5q") pod "0b53dc26-94b9-4fe8-9ac6-40239830cc3c" (UID: "0b53dc26-94b9-4fe8-9ac6-40239830cc3c"). InnerVolumeSpecName "kube-api-access-pvt5q". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:45:12.372653 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:12.372645 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0b53dc26-94b9-4fe8-9ac6-40239830cc3c" (UID: "0b53dc26-94b9-4fe8-9ac6-40239830cc3c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:45:12.471661 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:12.471619 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-console-serving-cert\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:45:12.471661 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:12.471648 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pvt5q\" (UniqueName: \"kubernetes.io/projected/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-kube-api-access-pvt5q\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:45:12.471661 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:12.471658 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-console-oauth-config\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:45:12.471661 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:12.471668 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-oauth-serving-cert\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:45:12.471917 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:12.471677 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-trusted-ca-bundle\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:45:12.471917 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:12.471686 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-service-ca\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:45:12.471917 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:12.471695 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0b53dc26-94b9-4fe8-9ac6-40239830cc3c-console-config\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:45:13.060906 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:13.060878 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8575f75ff6-hjp2t_0b53dc26-94b9-4fe8-9ac6-40239830cc3c/console/0.log" Apr 21 15:45:13.061299 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:13.060916 2569 generic.go:358] "Generic (PLEG): container finished" podID="0b53dc26-94b9-4fe8-9ac6-40239830cc3c" containerID="a88f7b2fedec20fc2107982e83a28af5e2ef05bc3a560cba3747318cf96ad20f" exitCode=2 Apr 21 15:45:13.061299 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:13.061007 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8575f75ff6-hjp2t" event={"ID":"0b53dc26-94b9-4fe8-9ac6-40239830cc3c","Type":"ContainerDied","Data":"a88f7b2fedec20fc2107982e83a28af5e2ef05bc3a560cba3747318cf96ad20f"} Apr 21 15:45:13.061299 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:13.061027 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8575f75ff6-hjp2t" Apr 21 15:45:13.061299 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:13.061045 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8575f75ff6-hjp2t" event={"ID":"0b53dc26-94b9-4fe8-9ac6-40239830cc3c","Type":"ContainerDied","Data":"25a85d774b329c1a8808ce3b7ad564ee1e3981f9f74f59bd08b9602546a973f3"} Apr 21 15:45:13.061299 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:13.061061 2569 scope.go:117] "RemoveContainer" containerID="a88f7b2fedec20fc2107982e83a28af5e2ef05bc3a560cba3747318cf96ad20f" Apr 21 15:45:13.069431 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:13.069416 2569 scope.go:117] "RemoveContainer" containerID="a88f7b2fedec20fc2107982e83a28af5e2ef05bc3a560cba3747318cf96ad20f" Apr 21 15:45:13.069709 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:45:13.069691 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a88f7b2fedec20fc2107982e83a28af5e2ef05bc3a560cba3747318cf96ad20f\": container with ID starting with a88f7b2fedec20fc2107982e83a28af5e2ef05bc3a560cba3747318cf96ad20f not found: ID does not exist" containerID="a88f7b2fedec20fc2107982e83a28af5e2ef05bc3a560cba3747318cf96ad20f" Apr 21 15:45:13.069761 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:13.069717 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a88f7b2fedec20fc2107982e83a28af5e2ef05bc3a560cba3747318cf96ad20f"} err="failed to get container status \"a88f7b2fedec20fc2107982e83a28af5e2ef05bc3a560cba3747318cf96ad20f\": rpc error: code = NotFound desc = could not find container \"a88f7b2fedec20fc2107982e83a28af5e2ef05bc3a560cba3747318cf96ad20f\": container with ID starting with a88f7b2fedec20fc2107982e83a28af5e2ef05bc3a560cba3747318cf96ad20f not found: ID does not exist" Apr 21 15:45:13.095344 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:13.095312 2569 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8575f75ff6-hjp2t"] Apr 21 15:45:13.127572 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:13.127540 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-8575f75ff6-hjp2t"] Apr 21 15:45:15.054685 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:15.054651 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b53dc26-94b9-4fe8-9ac6-40239830cc3c" path="/var/lib/kubelet/pods/0b53dc26-94b9-4fe8-9ac6-40239830cc3c/volumes" Apr 21 15:45:24.961338 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:24.961312 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wkxqg_7b0083d8-b152-40fa-9a89-e3180ed1747d/ovn-acl-logging/0.log" Apr 21 15:45:24.961787 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:24.961760 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wkxqg_7b0083d8-b152-40fa-9a89-e3180ed1747d/ovn-acl-logging/0.log" Apr 21 15:45:28.946654 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.946618 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-nx62s"] Apr 21 15:45:28.947006 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.946918 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42c50b7b-6e3a-4096-a202-29b924d01c73" containerName="util" Apr 21 15:45:28.947006 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.946928 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c50b7b-6e3a-4096-a202-29b924d01c73" containerName="util" Apr 21 15:45:28.947006 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.946938 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43bb6262-1d2f-4dfb-8a8b-237b4d4a2940" containerName="pull" Apr 21 15:45:28.947006 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.946943 2569 
state_mem.go:107] "Deleted CPUSet assignment" podUID="43bb6262-1d2f-4dfb-8a8b-237b4d4a2940" containerName="pull" Apr 21 15:45:28.947006 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.946963 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a176f36-fb8d-4ab8-9b0c-35bf2be53527" containerName="pull" Apr 21 15:45:28.947006 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.946969 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a176f36-fb8d-4ab8-9b0c-35bf2be53527" containerName="pull" Apr 21 15:45:28.947006 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.946977 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d58ab93d-efd6-42a4-b678-f69ab4fec878" containerName="util" Apr 21 15:45:28.947006 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.946982 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d58ab93d-efd6-42a4-b678-f69ab4fec878" containerName="util" Apr 21 15:45:28.947006 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.946987 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d58ab93d-efd6-42a4-b678-f69ab4fec878" containerName="extract" Apr 21 15:45:28.947006 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.946992 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d58ab93d-efd6-42a4-b678-f69ab4fec878" containerName="extract" Apr 21 15:45:28.947006 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.946999 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b53dc26-94b9-4fe8-9ac6-40239830cc3c" containerName="console" Apr 21 15:45:28.947006 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.947004 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b53dc26-94b9-4fe8-9ac6-40239830cc3c" containerName="console" Apr 21 15:45:28.947341 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.947014 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="42c50b7b-6e3a-4096-a202-29b924d01c73" containerName="pull" Apr 21 15:45:28.947341 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.947019 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c50b7b-6e3a-4096-a202-29b924d01c73" containerName="pull" Apr 21 15:45:28.947341 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.947026 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43bb6262-1d2f-4dfb-8a8b-237b4d4a2940" containerName="util" Apr 21 15:45:28.947341 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.947031 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="43bb6262-1d2f-4dfb-8a8b-237b4d4a2940" containerName="util" Apr 21 15:45:28.947341 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.947038 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d58ab93d-efd6-42a4-b678-f69ab4fec878" containerName="pull" Apr 21 15:45:28.947341 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.947043 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d58ab93d-efd6-42a4-b678-f69ab4fec878" containerName="pull" Apr 21 15:45:28.947341 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.947051 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a176f36-fb8d-4ab8-9b0c-35bf2be53527" containerName="util" Apr 21 15:45:28.947341 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.947056 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a176f36-fb8d-4ab8-9b0c-35bf2be53527" containerName="util" Apr 21 15:45:28.947341 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.947062 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43bb6262-1d2f-4dfb-8a8b-237b4d4a2940" containerName="extract" Apr 21 15:45:28.947341 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.947067 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="43bb6262-1d2f-4dfb-8a8b-237b4d4a2940" containerName="extract" Apr 21 
15:45:28.947341 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.947075 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42c50b7b-6e3a-4096-a202-29b924d01c73" containerName="extract" Apr 21 15:45:28.947341 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.947079 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c50b7b-6e3a-4096-a202-29b924d01c73" containerName="extract" Apr 21 15:45:28.947341 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.947085 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a176f36-fb8d-4ab8-9b0c-35bf2be53527" containerName="extract" Apr 21 15:45:28.947341 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.947090 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a176f36-fb8d-4ab8-9b0c-35bf2be53527" containerName="extract" Apr 21 15:45:28.947341 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.947136 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="43bb6262-1d2f-4dfb-8a8b-237b4d4a2940" containerName="extract" Apr 21 15:45:28.947341 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.947143 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="d58ab93d-efd6-42a4-b678-f69ab4fec878" containerName="extract" Apr 21 15:45:28.947341 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.947149 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="42c50b7b-6e3a-4096-a202-29b924d01c73" containerName="extract" Apr 21 15:45:28.947341 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.947157 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b53dc26-94b9-4fe8-9ac6-40239830cc3c" containerName="console" Apr 21 15:45:28.947341 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.947165 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a176f36-fb8d-4ab8-9b0c-35bf2be53527" containerName="extract" Apr 21 15:45:28.951067 ip-10-0-128-232 kubenswrapper[2569]: I0421 
15:45:28.951051 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-nx62s" Apr 21 15:45:28.956562 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.955809 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 21 15:45:28.956562 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.956138 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 21 15:45:28.956562 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.956394 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-pv44v\"" Apr 21 15:45:28.956562 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.956432 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 21 15:45:28.957427 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.957222 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 21 15:45:28.962181 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:28.962156 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-nx62s"] Apr 21 15:45:29.012922 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:29.012882 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3060312c-c108-46b7-b499-e2fe4957e773-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-nx62s\" (UID: \"3060312c-c108-46b7-b499-e2fe4957e773\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-nx62s" Apr 21 15:45:29.013091 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:29.012932 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrg6d\" (UniqueName: \"kubernetes.io/projected/3060312c-c108-46b7-b499-e2fe4957e773-kube-api-access-vrg6d\") pod \"kuadrant-console-plugin-6c886788f8-nx62s\" (UID: \"3060312c-c108-46b7-b499-e2fe4957e773\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-nx62s" Apr 21 15:45:29.013091 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:29.012962 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3060312c-c108-46b7-b499-e2fe4957e773-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-nx62s\" (UID: \"3060312c-c108-46b7-b499-e2fe4957e773\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-nx62s" Apr 21 15:45:29.113312 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:29.113275 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vrg6d\" (UniqueName: \"kubernetes.io/projected/3060312c-c108-46b7-b499-e2fe4957e773-kube-api-access-vrg6d\") pod \"kuadrant-console-plugin-6c886788f8-nx62s\" (UID: \"3060312c-c108-46b7-b499-e2fe4957e773\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-nx62s" Apr 21 15:45:29.113459 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:29.113343 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3060312c-c108-46b7-b499-e2fe4957e773-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-nx62s\" (UID: \"3060312c-c108-46b7-b499-e2fe4957e773\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-nx62s" Apr 21 15:45:29.113459 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:29.113427 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3060312c-c108-46b7-b499-e2fe4957e773-plugin-serving-cert\") pod 
\"kuadrant-console-plugin-6c886788f8-nx62s\" (UID: \"3060312c-c108-46b7-b499-e2fe4957e773\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-nx62s" Apr 21 15:45:29.113596 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:45:29.113581 2569 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 21 15:45:29.113672 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:45:29.113658 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3060312c-c108-46b7-b499-e2fe4957e773-plugin-serving-cert podName:3060312c-c108-46b7-b499-e2fe4957e773 nodeName:}" failed. No retries permitted until 2026-04-21 15:45:29.613634723 +0000 UTC m=+605.185667658 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/3060312c-c108-46b7-b499-e2fe4957e773-plugin-serving-cert") pod "kuadrant-console-plugin-6c886788f8-nx62s" (UID: "3060312c-c108-46b7-b499-e2fe4957e773") : secret "plugin-serving-cert" not found Apr 21 15:45:29.114059 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:29.114038 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3060312c-c108-46b7-b499-e2fe4957e773-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-nx62s\" (UID: \"3060312c-c108-46b7-b499-e2fe4957e773\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-nx62s" Apr 21 15:45:29.124511 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:29.124456 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrg6d\" (UniqueName: \"kubernetes.io/projected/3060312c-c108-46b7-b499-e2fe4957e773-kube-api-access-vrg6d\") pod \"kuadrant-console-plugin-6c886788f8-nx62s\" (UID: \"3060312c-c108-46b7-b499-e2fe4957e773\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-nx62s" Apr 21 15:45:29.617196 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:29.617145 
2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3060312c-c108-46b7-b499-e2fe4957e773-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-nx62s\" (UID: \"3060312c-c108-46b7-b499-e2fe4957e773\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-nx62s" Apr 21 15:45:29.617360 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:45:29.617299 2569 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 21 15:45:29.617440 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:45:29.617426 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3060312c-c108-46b7-b499-e2fe4957e773-plugin-serving-cert podName:3060312c-c108-46b7-b499-e2fe4957e773 nodeName:}" failed. No retries permitted until 2026-04-21 15:45:30.617404415 +0000 UTC m=+606.189437336 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/3060312c-c108-46b7-b499-e2fe4957e773-plugin-serving-cert") pod "kuadrant-console-plugin-6c886788f8-nx62s" (UID: "3060312c-c108-46b7-b499-e2fe4957e773") : secret "plugin-serving-cert" not found Apr 21 15:45:30.624628 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:30.624597 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3060312c-c108-46b7-b499-e2fe4957e773-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-nx62s\" (UID: \"3060312c-c108-46b7-b499-e2fe4957e773\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-nx62s" Apr 21 15:45:30.626965 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:30.626936 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3060312c-c108-46b7-b499-e2fe4957e773-plugin-serving-cert\") pod 
\"kuadrant-console-plugin-6c886788f8-nx62s\" (UID: \"3060312c-c108-46b7-b499-e2fe4957e773\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-nx62s" Apr 21 15:45:30.765631 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:30.765593 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-nx62s" Apr 21 15:45:30.907395 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:30.907366 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-nx62s"] Apr 21 15:45:30.908689 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:45:30.908659 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3060312c_c108_46b7_b499_e2fe4957e773.slice/crio-82abf187526421da22aefd22f106e6ed8c847a0465b8f54d4209aaf94b0058b7 WatchSource:0}: Error finding container 82abf187526421da22aefd22f106e6ed8c847a0465b8f54d4209aaf94b0058b7: Status 404 returned error can't find the container with id 82abf187526421da22aefd22f106e6ed8c847a0465b8f54d4209aaf94b0058b7 Apr 21 15:45:31.124537 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:31.124501 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-nx62s" event={"ID":"3060312c-c108-46b7-b499-e2fe4957e773","Type":"ContainerStarted","Data":"82abf187526421da22aefd22f106e6ed8c847a0465b8f54d4209aaf94b0058b7"} Apr 21 15:45:36.149191 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:36.149152 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-nx62s" event={"ID":"3060312c-c108-46b7-b499-e2fe4957e773","Type":"ContainerStarted","Data":"24e87b12c49763f99b6ca3ad356f4ae01226b584380928ab935bad39bc382781"} Apr 21 15:45:36.166953 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:45:36.166899 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-nx62s" podStartSLOduration=3.561743233 podStartE2EDuration="8.166885113s" podCreationTimestamp="2026-04-21 15:45:28 +0000 UTC" firstStartedPulling="2026-04-21 15:45:30.910092153 +0000 UTC m=+606.482125074" lastFinishedPulling="2026-04-21 15:45:35.515234028 +0000 UTC m=+611.087266954" observedRunningTime="2026-04-21 15:45:36.164942141 +0000 UTC m=+611.736975081" watchObservedRunningTime="2026-04-21 15:45:36.166885113 +0000 UTC m=+611.738918055" Apr 21 15:46:10.627983 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:10.627938 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-ftf9w"] Apr 21 15:46:10.633037 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:10.633002 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-ftf9w" Apr 21 15:46:10.635785 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:10.635765 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 21 15:46:10.636936 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:10.636907 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-ftf9w"] Apr 21 15:46:10.648407 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:10.648383 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/4f9160a5-c304-406e-babb-72a44777b033-config-file\") pod \"limitador-limitador-64c8f475fb-ftf9w\" (UID: \"4f9160a5-c304-406e-babb-72a44777b033\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-ftf9w" Apr 21 15:46:10.648528 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:10.648427 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8rkc\" (UniqueName: 
\"kubernetes.io/projected/4f9160a5-c304-406e-babb-72a44777b033-kube-api-access-j8rkc\") pod \"limitador-limitador-64c8f475fb-ftf9w\" (UID: \"4f9160a5-c304-406e-babb-72a44777b033\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-ftf9w" Apr 21 15:46:10.726423 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:10.726389 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-ftf9w"] Apr 21 15:46:10.749595 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:10.749560 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8rkc\" (UniqueName: \"kubernetes.io/projected/4f9160a5-c304-406e-babb-72a44777b033-kube-api-access-j8rkc\") pod \"limitador-limitador-64c8f475fb-ftf9w\" (UID: \"4f9160a5-c304-406e-babb-72a44777b033\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-ftf9w" Apr 21 15:46:10.749770 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:10.749666 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/4f9160a5-c304-406e-babb-72a44777b033-config-file\") pod \"limitador-limitador-64c8f475fb-ftf9w\" (UID: \"4f9160a5-c304-406e-babb-72a44777b033\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-ftf9w" Apr 21 15:46:10.750265 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:10.750246 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/4f9160a5-c304-406e-babb-72a44777b033-config-file\") pod \"limitador-limitador-64c8f475fb-ftf9w\" (UID: \"4f9160a5-c304-406e-babb-72a44777b033\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-ftf9w" Apr 21 15:46:10.758287 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:10.758264 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8rkc\" (UniqueName: 
\"kubernetes.io/projected/4f9160a5-c304-406e-babb-72a44777b033-kube-api-access-j8rkc\") pod \"limitador-limitador-64c8f475fb-ftf9w\" (UID: \"4f9160a5-c304-406e-babb-72a44777b033\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-ftf9w" Apr 21 15:46:10.944385 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:10.944340 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-ftf9w" Apr 21 15:46:11.076325 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:11.076289 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-ftf9w"] Apr 21 15:46:11.078698 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:46:11.078666 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f9160a5_c304_406e_babb_72a44777b033.slice/crio-a9b20ec2db458b74ccb3a859e6043e5cb07c7d4191fde9c8ccadd57c9913af28 WatchSource:0}: Error finding container a9b20ec2db458b74ccb3a859e6043e5cb07c7d4191fde9c8ccadd57c9913af28: Status 404 returned error can't find the container with id a9b20ec2db458b74ccb3a859e6043e5cb07c7d4191fde9c8ccadd57c9913af28 Apr 21 15:46:11.080394 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:11.080377 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:46:11.273308 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:11.273222 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-ftf9w" event={"ID":"4f9160a5-c304-406e-babb-72a44777b033","Type":"ContainerStarted","Data":"a9b20ec2db458b74ccb3a859e6043e5cb07c7d4191fde9c8ccadd57c9913af28"} Apr 21 15:46:11.712339 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:11.712304 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79cbc94b89-bgwt5"] Apr 21 15:46:11.716859 ip-10-0-128-232 kubenswrapper[2569]: I0421 
15:46:11.716841 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-bgwt5" Apr 21 15:46:11.719368 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:11.719343 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-hs69b\"" Apr 21 15:46:11.720108 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:11.720079 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-bgwt5"] Apr 21 15:46:11.755692 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:11.755661 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqrjq\" (UniqueName: \"kubernetes.io/projected/ebe6bb1e-1011-4684-a3b3-6972074d6d8b-kube-api-access-zqrjq\") pod \"authorino-79cbc94b89-bgwt5\" (UID: \"ebe6bb1e-1011-4684-a3b3-6972074d6d8b\") " pod="kuadrant-system/authorino-79cbc94b89-bgwt5" Apr 21 15:46:11.856454 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:11.856422 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zqrjq\" (UniqueName: \"kubernetes.io/projected/ebe6bb1e-1011-4684-a3b3-6972074d6d8b-kube-api-access-zqrjq\") pod \"authorino-79cbc94b89-bgwt5\" (UID: \"ebe6bb1e-1011-4684-a3b3-6972074d6d8b\") " pod="kuadrant-system/authorino-79cbc94b89-bgwt5" Apr 21 15:46:11.864871 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:11.864840 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqrjq\" (UniqueName: \"kubernetes.io/projected/ebe6bb1e-1011-4684-a3b3-6972074d6d8b-kube-api-access-zqrjq\") pod \"authorino-79cbc94b89-bgwt5\" (UID: \"ebe6bb1e-1011-4684-a3b3-6972074d6d8b\") " pod="kuadrant-system/authorino-79cbc94b89-bgwt5" Apr 21 15:46:12.026322 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:12.026226 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-bgwt5" Apr 21 15:46:12.164223 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:12.164198 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-bgwt5"] Apr 21 15:46:12.166873 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:46:12.166835 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebe6bb1e_1011_4684_a3b3_6972074d6d8b.slice/crio-ae43ed395f223757d8272df2b1abe9cd824fe4436110a5f21e737a5bb725a20c WatchSource:0}: Error finding container ae43ed395f223757d8272df2b1abe9cd824fe4436110a5f21e737a5bb725a20c: Status 404 returned error can't find the container with id ae43ed395f223757d8272df2b1abe9cd824fe4436110a5f21e737a5bb725a20c Apr 21 15:46:12.278501 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:12.278394 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-bgwt5" event={"ID":"ebe6bb1e-1011-4684-a3b3-6972074d6d8b","Type":"ContainerStarted","Data":"ae43ed395f223757d8272df2b1abe9cd824fe4436110a5f21e737a5bb725a20c"} Apr 21 15:46:13.284128 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:13.284088 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-ftf9w" event={"ID":"4f9160a5-c304-406e-babb-72a44777b033","Type":"ContainerStarted","Data":"3a5b49154b9917364d62fdd0113a801760e08e151eca83fbcd2c9004bcba95d5"} Apr 21 15:46:13.284530 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:13.284204 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-ftf9w" Apr 21 15:46:13.303895 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:13.303839 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-ftf9w" podStartSLOduration=1.8192046450000001 
podStartE2EDuration="3.30382347s" podCreationTimestamp="2026-04-21 15:46:10 +0000 UTC" firstStartedPulling="2026-04-21 15:46:11.080546403 +0000 UTC m=+646.652579325" lastFinishedPulling="2026-04-21 15:46:12.565165224 +0000 UTC m=+648.137198150" observedRunningTime="2026-04-21 15:46:13.302463227 +0000 UTC m=+648.874496169" watchObservedRunningTime="2026-04-21 15:46:13.30382347 +0000 UTC m=+648.875856412" Apr 21 15:46:16.296308 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:16.296272 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-bgwt5" event={"ID":"ebe6bb1e-1011-4684-a3b3-6972074d6d8b","Type":"ContainerStarted","Data":"c3fad469eb33d959edecc3175ccf6a4cf7426c03ae37d0ebf5453880069f764f"} Apr 21 15:46:16.313299 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:16.313246 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-bgwt5" podStartSLOduration=1.659839921 podStartE2EDuration="5.313231695s" podCreationTimestamp="2026-04-21 15:46:11 +0000 UTC" firstStartedPulling="2026-04-21 15:46:12.168170505 +0000 UTC m=+647.740203430" lastFinishedPulling="2026-04-21 15:46:15.82156228 +0000 UTC m=+651.393595204" observedRunningTime="2026-04-21 15:46:16.311101223 +0000 UTC m=+651.883134180" watchObservedRunningTime="2026-04-21 15:46:16.313231695 +0000 UTC m=+651.885264639" Apr 21 15:46:24.288649 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:24.288621 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-ftf9w" Apr 21 15:46:29.514556 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:29.514523 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-ftf9w"] Apr 21 15:46:29.514968 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:29.514734 2569 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kuadrant-system/limitador-limitador-64c8f475fb-ftf9w" podUID="4f9160a5-c304-406e-babb-72a44777b033" containerName="limitador" containerID="cri-o://3a5b49154b9917364d62fdd0113a801760e08e151eca83fbcd2c9004bcba95d5" gracePeriod=30 Apr 21 15:46:30.069003 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:30.068979 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-ftf9w" Apr 21 15:46:30.100509 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:30.100401 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8rkc\" (UniqueName: \"kubernetes.io/projected/4f9160a5-c304-406e-babb-72a44777b033-kube-api-access-j8rkc\") pod \"4f9160a5-c304-406e-babb-72a44777b033\" (UID: \"4f9160a5-c304-406e-babb-72a44777b033\") " Apr 21 15:46:30.100509 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:30.100449 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/4f9160a5-c304-406e-babb-72a44777b033-config-file\") pod \"4f9160a5-c304-406e-babb-72a44777b033\" (UID: \"4f9160a5-c304-406e-babb-72a44777b033\") " Apr 21 15:46:30.100864 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:30.100835 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f9160a5-c304-406e-babb-72a44777b033-config-file" (OuterVolumeSpecName: "config-file") pod "4f9160a5-c304-406e-babb-72a44777b033" (UID: "4f9160a5-c304-406e-babb-72a44777b033"). InnerVolumeSpecName "config-file". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:46:30.102752 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:30.102726 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f9160a5-c304-406e-babb-72a44777b033-kube-api-access-j8rkc" (OuterVolumeSpecName: "kube-api-access-j8rkc") pod "4f9160a5-c304-406e-babb-72a44777b033" (UID: "4f9160a5-c304-406e-babb-72a44777b033"). InnerVolumeSpecName "kube-api-access-j8rkc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:46:30.201348 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:30.201311 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j8rkc\" (UniqueName: \"kubernetes.io/projected/4f9160a5-c304-406e-babb-72a44777b033-kube-api-access-j8rkc\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:46:30.201348 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:30.201347 2569 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/4f9160a5-c304-406e-babb-72a44777b033-config-file\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:46:30.347030 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:30.346993 2569 generic.go:358] "Generic (PLEG): container finished" podID="4f9160a5-c304-406e-babb-72a44777b033" containerID="3a5b49154b9917364d62fdd0113a801760e08e151eca83fbcd2c9004bcba95d5" exitCode=0 Apr 21 15:46:30.347196 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:30.347050 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-ftf9w" event={"ID":"4f9160a5-c304-406e-babb-72a44777b033","Type":"ContainerDied","Data":"3a5b49154b9917364d62fdd0113a801760e08e151eca83fbcd2c9004bcba95d5"} Apr 21 15:46:30.347196 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:30.347076 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-ftf9w" 
event={"ID":"4f9160a5-c304-406e-babb-72a44777b033","Type":"ContainerDied","Data":"a9b20ec2db458b74ccb3a859e6043e5cb07c7d4191fde9c8ccadd57c9913af28"} Apr 21 15:46:30.347196 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:30.347080 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-ftf9w" Apr 21 15:46:30.347196 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:30.347091 2569 scope.go:117] "RemoveContainer" containerID="3a5b49154b9917364d62fdd0113a801760e08e151eca83fbcd2c9004bcba95d5" Apr 21 15:46:30.356401 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:30.356377 2569 scope.go:117] "RemoveContainer" containerID="3a5b49154b9917364d62fdd0113a801760e08e151eca83fbcd2c9004bcba95d5" Apr 21 15:46:30.356687 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:46:30.356666 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a5b49154b9917364d62fdd0113a801760e08e151eca83fbcd2c9004bcba95d5\": container with ID starting with 3a5b49154b9917364d62fdd0113a801760e08e151eca83fbcd2c9004bcba95d5 not found: ID does not exist" containerID="3a5b49154b9917364d62fdd0113a801760e08e151eca83fbcd2c9004bcba95d5" Apr 21 15:46:30.356745 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:30.356695 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a5b49154b9917364d62fdd0113a801760e08e151eca83fbcd2c9004bcba95d5"} err="failed to get container status \"3a5b49154b9917364d62fdd0113a801760e08e151eca83fbcd2c9004bcba95d5\": rpc error: code = NotFound desc = could not find container \"3a5b49154b9917364d62fdd0113a801760e08e151eca83fbcd2c9004bcba95d5\": container with ID starting with 3a5b49154b9917364d62fdd0113a801760e08e151eca83fbcd2c9004bcba95d5 not found: ID does not exist" Apr 21 15:46:30.369489 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:30.369455 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kuadrant-system/limitador-limitador-64c8f475fb-ftf9w"] Apr 21 15:46:30.374575 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:30.374552 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-ftf9w"] Apr 21 15:46:31.055023 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:31.054987 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f9160a5-c304-406e-babb-72a44777b033" path="/var/lib/kubelet/pods/4f9160a5-c304-406e-babb-72a44777b033/volumes" Apr 21 15:46:38.213870 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:38.213836 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-68bd676465-4snmx"] Apr 21 15:46:38.214243 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:38.214148 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f9160a5-c304-406e-babb-72a44777b033" containerName="limitador" Apr 21 15:46:38.214243 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:38.214159 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f9160a5-c304-406e-babb-72a44777b033" containerName="limitador" Apr 21 15:46:38.214243 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:38.214222 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f9160a5-c304-406e-babb-72a44777b033" containerName="limitador" Apr 21 15:46:38.218675 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:38.218658 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bd676465-4snmx" Apr 21 15:46:38.221538 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:38.221513 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 21 15:46:38.226384 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:38.226360 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-4snmx"] Apr 21 15:46:38.268241 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:38.268208 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/f2e1d285-ef28-46cc-bdca-484abff6dcb4-tls-cert\") pod \"authorino-68bd676465-4snmx\" (UID: \"f2e1d285-ef28-46cc-bdca-484abff6dcb4\") " pod="kuadrant-system/authorino-68bd676465-4snmx" Apr 21 15:46:38.268241 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:38.268246 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p9rc\" (UniqueName: \"kubernetes.io/projected/f2e1d285-ef28-46cc-bdca-484abff6dcb4-kube-api-access-7p9rc\") pod \"authorino-68bd676465-4snmx\" (UID: \"f2e1d285-ef28-46cc-bdca-484abff6dcb4\") " pod="kuadrant-system/authorino-68bd676465-4snmx" Apr 21 15:46:38.369629 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:38.369588 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/f2e1d285-ef28-46cc-bdca-484abff6dcb4-tls-cert\") pod \"authorino-68bd676465-4snmx\" (UID: \"f2e1d285-ef28-46cc-bdca-484abff6dcb4\") " pod="kuadrant-system/authorino-68bd676465-4snmx" Apr 21 15:46:38.369629 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:38.369628 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7p9rc\" (UniqueName: 
\"kubernetes.io/projected/f2e1d285-ef28-46cc-bdca-484abff6dcb4-kube-api-access-7p9rc\") pod \"authorino-68bd676465-4snmx\" (UID: \"f2e1d285-ef28-46cc-bdca-484abff6dcb4\") " pod="kuadrant-system/authorino-68bd676465-4snmx" Apr 21 15:46:38.372031 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:38.372006 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/f2e1d285-ef28-46cc-bdca-484abff6dcb4-tls-cert\") pod \"authorino-68bd676465-4snmx\" (UID: \"f2e1d285-ef28-46cc-bdca-484abff6dcb4\") " pod="kuadrant-system/authorino-68bd676465-4snmx" Apr 21 15:46:38.380246 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:38.380222 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p9rc\" (UniqueName: \"kubernetes.io/projected/f2e1d285-ef28-46cc-bdca-484abff6dcb4-kube-api-access-7p9rc\") pod \"authorino-68bd676465-4snmx\" (UID: \"f2e1d285-ef28-46cc-bdca-484abff6dcb4\") " pod="kuadrant-system/authorino-68bd676465-4snmx" Apr 21 15:46:38.529236 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:38.529155 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bd676465-4snmx" Apr 21 15:46:38.886032 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:38.885964 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-4snmx"] Apr 21 15:46:38.888269 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:46:38.888237 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2e1d285_ef28_46cc_bdca_484abff6dcb4.slice/crio-6661ee6b25db7063a518ab044326b2a6bbcaa5f68eca706be30369f657f729de WatchSource:0}: Error finding container 6661ee6b25db7063a518ab044326b2a6bbcaa5f68eca706be30369f657f729de: Status 404 returned error can't find the container with id 6661ee6b25db7063a518ab044326b2a6bbcaa5f68eca706be30369f657f729de Apr 21 15:46:39.382704 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:39.382655 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-4snmx" event={"ID":"f2e1d285-ef28-46cc-bdca-484abff6dcb4","Type":"ContainerStarted","Data":"6661ee6b25db7063a518ab044326b2a6bbcaa5f68eca706be30369f657f729de"} Apr 21 15:46:40.387171 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:40.387135 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-4snmx" event={"ID":"f2e1d285-ef28-46cc-bdca-484abff6dcb4","Type":"ContainerStarted","Data":"4f85ddbad65369cdca9708a36f35a64480c3d0aa70b6aa69362548d2cb5e0c35"} Apr 21 15:46:40.402824 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:40.402776 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68bd676465-4snmx" podStartSLOduration=1.62355649 podStartE2EDuration="2.402763209s" podCreationTimestamp="2026-04-21 15:46:38 +0000 UTC" firstStartedPulling="2026-04-21 15:46:38.889607997 +0000 UTC m=+674.461640918" lastFinishedPulling="2026-04-21 15:46:39.668814716 +0000 UTC m=+675.240847637" 
observedRunningTime="2026-04-21 15:46:40.401144625 +0000 UTC m=+675.973177570" watchObservedRunningTime="2026-04-21 15:46:40.402763209 +0000 UTC m=+675.974796151" Apr 21 15:46:40.429614 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:40.429587 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-bgwt5"] Apr 21 15:46:40.429803 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:40.429782 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-bgwt5" podUID="ebe6bb1e-1011-4684-a3b3-6972074d6d8b" containerName="authorino" containerID="cri-o://c3fad469eb33d959edecc3175ccf6a4cf7426c03ae37d0ebf5453880069f764f" gracePeriod=30 Apr 21 15:46:40.670755 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:40.670731 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-bgwt5" Apr 21 15:46:40.789099 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:40.789065 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqrjq\" (UniqueName: \"kubernetes.io/projected/ebe6bb1e-1011-4684-a3b3-6972074d6d8b-kube-api-access-zqrjq\") pod \"ebe6bb1e-1011-4684-a3b3-6972074d6d8b\" (UID: \"ebe6bb1e-1011-4684-a3b3-6972074d6d8b\") " Apr 21 15:46:40.791096 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:40.791068 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebe6bb1e-1011-4684-a3b3-6972074d6d8b-kube-api-access-zqrjq" (OuterVolumeSpecName: "kube-api-access-zqrjq") pod "ebe6bb1e-1011-4684-a3b3-6972074d6d8b" (UID: "ebe6bb1e-1011-4684-a3b3-6972074d6d8b"). InnerVolumeSpecName "kube-api-access-zqrjq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:46:40.890247 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:40.890210 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zqrjq\" (UniqueName: \"kubernetes.io/projected/ebe6bb1e-1011-4684-a3b3-6972074d6d8b-kube-api-access-zqrjq\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:46:41.391516 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:41.391402 2569 generic.go:358] "Generic (PLEG): container finished" podID="ebe6bb1e-1011-4684-a3b3-6972074d6d8b" containerID="c3fad469eb33d959edecc3175ccf6a4cf7426c03ae37d0ebf5453880069f764f" exitCode=0 Apr 21 15:46:41.391516 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:41.391463 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-bgwt5" Apr 21 15:46:41.391516 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:41.391506 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-bgwt5" event={"ID":"ebe6bb1e-1011-4684-a3b3-6972074d6d8b","Type":"ContainerDied","Data":"c3fad469eb33d959edecc3175ccf6a4cf7426c03ae37d0ebf5453880069f764f"} Apr 21 15:46:41.392013 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:41.391543 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-bgwt5" event={"ID":"ebe6bb1e-1011-4684-a3b3-6972074d6d8b","Type":"ContainerDied","Data":"ae43ed395f223757d8272df2b1abe9cd824fe4436110a5f21e737a5bb725a20c"} Apr 21 15:46:41.392013 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:41.391557 2569 scope.go:117] "RemoveContainer" containerID="c3fad469eb33d959edecc3175ccf6a4cf7426c03ae37d0ebf5453880069f764f" Apr 21 15:46:41.399846 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:41.399826 2569 scope.go:117] "RemoveContainer" containerID="c3fad469eb33d959edecc3175ccf6a4cf7426c03ae37d0ebf5453880069f764f" Apr 21 15:46:41.400108 ip-10-0-128-232 kubenswrapper[2569]: 
E0421 15:46:41.400081 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3fad469eb33d959edecc3175ccf6a4cf7426c03ae37d0ebf5453880069f764f\": container with ID starting with c3fad469eb33d959edecc3175ccf6a4cf7426c03ae37d0ebf5453880069f764f not found: ID does not exist" containerID="c3fad469eb33d959edecc3175ccf6a4cf7426c03ae37d0ebf5453880069f764f" Apr 21 15:46:41.400211 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:41.400117 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3fad469eb33d959edecc3175ccf6a4cf7426c03ae37d0ebf5453880069f764f"} err="failed to get container status \"c3fad469eb33d959edecc3175ccf6a4cf7426c03ae37d0ebf5453880069f764f\": rpc error: code = NotFound desc = could not find container \"c3fad469eb33d959edecc3175ccf6a4cf7426c03ae37d0ebf5453880069f764f\": container with ID starting with c3fad469eb33d959edecc3175ccf6a4cf7426c03ae37d0ebf5453880069f764f not found: ID does not exist" Apr 21 15:46:41.407657 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:41.407633 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-bgwt5"] Apr 21 15:46:41.411109 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:41.411086 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-bgwt5"] Apr 21 15:46:43.055110 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:46:43.055076 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebe6bb1e-1011-4684-a3b3-6972074d6d8b" path="/var/lib/kubelet/pods/ebe6bb1e-1011-4684-a3b3-6972074d6d8b/volumes" Apr 21 15:48:51.989460 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:51.989426 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm"] Apr 21 15:48:51.989910 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:51.989796 2569 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebe6bb1e-1011-4684-a3b3-6972074d6d8b" containerName="authorino" Apr 21 15:48:51.989910 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:51.989807 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe6bb1e-1011-4684-a3b3-6972074d6d8b" containerName="authorino" Apr 21 15:48:51.989910 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:51.989863 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="ebe6bb1e-1011-4684-a3b3-6972074d6d8b" containerName="authorino" Apr 21 15:48:51.993089 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:51.993073 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" Apr 21 15:48:51.998024 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:51.997990 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 21 15:48:51.998024 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:51.998010 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-lp6xp\"" Apr 21 15:48:51.998966 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:51.998945 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 21 15:48:51.999101 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:51.999023 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-epp-sa-dockercfg-6sk7f\"" Apr 21 15:48:51.999168 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:51.999121 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 21 15:48:52.008701 ip-10-0-128-232 kubenswrapper[2569]: I0421 
15:48:52.008669 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm"] Apr 21 15:48:52.105931 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:52.105890 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/05aec5cd-e962-44dd-aa90-cc9747fa28cd-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm\" (UID: \"05aec5cd-e962-44dd-aa90-cc9747fa28cd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" Apr 21 15:48:52.105931 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:52.105940 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/05aec5cd-e962-44dd-aa90-cc9747fa28cd-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm\" (UID: \"05aec5cd-e962-44dd-aa90-cc9747fa28cd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" Apr 21 15:48:52.106170 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:52.106006 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wz27\" (UniqueName: \"kubernetes.io/projected/05aec5cd-e962-44dd-aa90-cc9747fa28cd-kube-api-access-8wz27\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm\" (UID: \"05aec5cd-e962-44dd-aa90-cc9747fa28cd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" Apr 21 15:48:52.106170 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:52.106062 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/05aec5cd-e962-44dd-aa90-cc9747fa28cd-tls-certs\") pod 
\"scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm\" (UID: \"05aec5cd-e962-44dd-aa90-cc9747fa28cd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" Apr 21 15:48:52.106170 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:52.106142 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/05aec5cd-e962-44dd-aa90-cc9747fa28cd-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm\" (UID: \"05aec5cd-e962-44dd-aa90-cc9747fa28cd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" Apr 21 15:48:52.106281 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:52.106184 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05aec5cd-e962-44dd-aa90-cc9747fa28cd-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm\" (UID: \"05aec5cd-e962-44dd-aa90-cc9747fa28cd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" Apr 21 15:48:52.207410 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:52.207371 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/05aec5cd-e962-44dd-aa90-cc9747fa28cd-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm\" (UID: \"05aec5cd-e962-44dd-aa90-cc9747fa28cd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" Apr 21 15:48:52.207410 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:52.207413 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/05aec5cd-e962-44dd-aa90-cc9747fa28cd-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm\" (UID: \"05aec5cd-e962-44dd-aa90-cc9747fa28cd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" Apr 21 15:48:52.207666 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:52.207448 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/05aec5cd-e962-44dd-aa90-cc9747fa28cd-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm\" (UID: \"05aec5cd-e962-44dd-aa90-cc9747fa28cd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" Apr 21 15:48:52.207666 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:52.207474 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/05aec5cd-e962-44dd-aa90-cc9747fa28cd-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm\" (UID: \"05aec5cd-e962-44dd-aa90-cc9747fa28cd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" Apr 21 15:48:52.207666 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:52.207542 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wz27\" (UniqueName: \"kubernetes.io/projected/05aec5cd-e962-44dd-aa90-cc9747fa28cd-kube-api-access-8wz27\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm\" (UID: \"05aec5cd-e962-44dd-aa90-cc9747fa28cd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" Apr 21 15:48:52.207666 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:52.207589 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/05aec5cd-e962-44dd-aa90-cc9747fa28cd-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm\" (UID: \"05aec5cd-e962-44dd-aa90-cc9747fa28cd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" Apr 21 15:48:52.207863 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:52.207801 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/05aec5cd-e962-44dd-aa90-cc9747fa28cd-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm\" (UID: \"05aec5cd-e962-44dd-aa90-cc9747fa28cd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" Apr 21 15:48:52.207922 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:52.207893 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/05aec5cd-e962-44dd-aa90-cc9747fa28cd-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm\" (UID: \"05aec5cd-e962-44dd-aa90-cc9747fa28cd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" Apr 21 15:48:52.207976 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:52.207894 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05aec5cd-e962-44dd-aa90-cc9747fa28cd-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm\" (UID: \"05aec5cd-e962-44dd-aa90-cc9747fa28cd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" Apr 21 15:48:52.207976 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:52.207944 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/05aec5cd-e962-44dd-aa90-cc9747fa28cd-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm\" (UID: \"05aec5cd-e962-44dd-aa90-cc9747fa28cd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" Apr 21 15:48:52.210075 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:52.210059 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/05aec5cd-e962-44dd-aa90-cc9747fa28cd-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm\" (UID: \"05aec5cd-e962-44dd-aa90-cc9747fa28cd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" Apr 21 15:48:52.219859 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:52.219829 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wz27\" (UniqueName: \"kubernetes.io/projected/05aec5cd-e962-44dd-aa90-cc9747fa28cd-kube-api-access-8wz27\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm\" (UID: \"05aec5cd-e962-44dd-aa90-cc9747fa28cd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" Apr 21 15:48:52.303246 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:52.303149 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" Apr 21 15:48:52.440050 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:52.440020 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm"] Apr 21 15:48:52.441493 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:48:52.441456 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05aec5cd_e962_44dd_aa90_cc9747fa28cd.slice/crio-25b71999f104d1ea2fe2d16e732b8dfb727fe6ebd80f1524fbf7468808389d24 WatchSource:0}: Error finding container 25b71999f104d1ea2fe2d16e732b8dfb727fe6ebd80f1524fbf7468808389d24: Status 404 returned error can't find the container with id 25b71999f104d1ea2fe2d16e732b8dfb727fe6ebd80f1524fbf7468808389d24 Apr 21 15:48:52.881006 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:52.880972 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" event={"ID":"05aec5cd-e962-44dd-aa90-cc9747fa28cd","Type":"ContainerStarted","Data":"25b71999f104d1ea2fe2d16e732b8dfb727fe6ebd80f1524fbf7468808389d24"} Apr 21 15:48:56.900671 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:56.900621 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" event={"ID":"05aec5cd-e962-44dd-aa90-cc9747fa28cd","Type":"ContainerStarted","Data":"876724ea5e7fc6bc329f9fb597e0e604e793dfe158957b7c0db0bf603f342da1"} Apr 21 15:48:57.905932 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:57.905891 2569 generic.go:358] "Generic (PLEG): container finished" podID="05aec5cd-e962-44dd-aa90-cc9747fa28cd" containerID="876724ea5e7fc6bc329f9fb597e0e604e793dfe158957b7c0db0bf603f342da1" exitCode=0 Apr 21 15:48:57.906412 ip-10-0-128-232 kubenswrapper[2569]: I0421 
15:48:57.905980 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" event={"ID":"05aec5cd-e962-44dd-aa90-cc9747fa28cd","Type":"ContainerDied","Data":"876724ea5e7fc6bc329f9fb597e0e604e793dfe158957b7c0db0bf603f342da1"} Apr 21 15:48:59.917953 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:48:59.917895 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" event={"ID":"05aec5cd-e962-44dd-aa90-cc9747fa28cd","Type":"ContainerStarted","Data":"30ce85de56643e7e1186a34c071dbeb3d5411ba1d14aaa701484af22309bf9e8"} Apr 21 15:49:29.055284 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:29.055249 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" event={"ID":"05aec5cd-e962-44dd-aa90-cc9747fa28cd","Type":"ContainerStarted","Data":"f311971831d280c08100aa4d7dba4619ab94c1856df91caa05057d9aaaaf1013"} Apr 21 15:49:29.055705 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:29.055297 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" Apr 21 15:49:29.055870 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:29.055853 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" Apr 21 15:49:29.078650 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:29.078587 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" podStartSLOduration=2.104578475 podStartE2EDuration="38.078571035s" podCreationTimestamp="2026-04-21 15:48:51 +0000 UTC" firstStartedPulling="2026-04-21 15:48:52.443439042 +0000 UTC 
m=+808.015471966" lastFinishedPulling="2026-04-21 15:49:28.417431597 +0000 UTC m=+843.989464526" observedRunningTime="2026-04-21 15:49:29.075313388 +0000 UTC m=+844.647346357" watchObservedRunningTime="2026-04-21 15:49:29.078571035 +0000 UTC m=+844.650603978" Apr 21 15:49:32.304160 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:32.304122 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" Apr 21 15:49:32.304160 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:32.304172 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" Apr 21 15:49:42.305793 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:42.305762 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" Apr 21 15:49:42.306837 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:42.306818 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" Apr 21 15:49:43.473081 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:43.473046 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm"] Apr 21 15:49:44.108560 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:44.108521 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" podUID="05aec5cd-e962-44dd-aa90-cc9747fa28cd" containerName="main" containerID="cri-o://30ce85de56643e7e1186a34c071dbeb3d5411ba1d14aaa701484af22309bf9e8" gracePeriod=30 Apr 21 15:49:44.108560 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:44.108547 2569 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" podUID="05aec5cd-e962-44dd-aa90-cc9747fa28cd" containerName="tokenizer" containerID="cri-o://f311971831d280c08100aa4d7dba4619ab94c1856df91caa05057d9aaaaf1013" gracePeriod=30 Apr 21 15:49:45.114429 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:45.114396 2569 generic.go:358] "Generic (PLEG): container finished" podID="05aec5cd-e962-44dd-aa90-cc9747fa28cd" containerID="30ce85de56643e7e1186a34c071dbeb3d5411ba1d14aaa701484af22309bf9e8" exitCode=0 Apr 21 15:49:45.114811 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:45.114464 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" event={"ID":"05aec5cd-e962-44dd-aa90-cc9747fa28cd","Type":"ContainerDied","Data":"30ce85de56643e7e1186a34c071dbeb3d5411ba1d14aaa701484af22309bf9e8"} Apr 21 15:49:45.348832 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:45.348807 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" Apr 21 15:49:45.499324 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:45.499291 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/05aec5cd-e962-44dd-aa90-cc9747fa28cd-tokenizer-uds\") pod \"05aec5cd-e962-44dd-aa90-cc9747fa28cd\" (UID: \"05aec5cd-e962-44dd-aa90-cc9747fa28cd\") " Apr 21 15:49:45.499324 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:45.499327 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/05aec5cd-e962-44dd-aa90-cc9747fa28cd-tokenizer-cache\") pod \"05aec5cd-e962-44dd-aa90-cc9747fa28cd\" (UID: \"05aec5cd-e962-44dd-aa90-cc9747fa28cd\") " Apr 21 15:49:45.499597 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:45.499361 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05aec5cd-e962-44dd-aa90-cc9747fa28cd-kserve-provision-location\") pod \"05aec5cd-e962-44dd-aa90-cc9747fa28cd\" (UID: \"05aec5cd-e962-44dd-aa90-cc9747fa28cd\") " Apr 21 15:49:45.499597 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:45.499390 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/05aec5cd-e962-44dd-aa90-cc9747fa28cd-tokenizer-tmp\") pod \"05aec5cd-e962-44dd-aa90-cc9747fa28cd\" (UID: \"05aec5cd-e962-44dd-aa90-cc9747fa28cd\") " Apr 21 15:49:45.499597 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:45.499408 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/05aec5cd-e962-44dd-aa90-cc9747fa28cd-tls-certs\") pod \"05aec5cd-e962-44dd-aa90-cc9747fa28cd\" (UID: \"05aec5cd-e962-44dd-aa90-cc9747fa28cd\") " Apr 21 
15:49:45.499597 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:45.499444 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wz27\" (UniqueName: \"kubernetes.io/projected/05aec5cd-e962-44dd-aa90-cc9747fa28cd-kube-api-access-8wz27\") pod \"05aec5cd-e962-44dd-aa90-cc9747fa28cd\" (UID: \"05aec5cd-e962-44dd-aa90-cc9747fa28cd\") " Apr 21 15:49:45.499794 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:45.499661 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05aec5cd-e962-44dd-aa90-cc9747fa28cd-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "05aec5cd-e962-44dd-aa90-cc9747fa28cd" (UID: "05aec5cd-e962-44dd-aa90-cc9747fa28cd"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:49:45.499794 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:45.499683 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05aec5cd-e962-44dd-aa90-cc9747fa28cd-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "05aec5cd-e962-44dd-aa90-cc9747fa28cd" (UID: "05aec5cd-e962-44dd-aa90-cc9747fa28cd"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:49:45.499865 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:45.499798 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05aec5cd-e962-44dd-aa90-cc9747fa28cd-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "05aec5cd-e962-44dd-aa90-cc9747fa28cd" (UID: "05aec5cd-e962-44dd-aa90-cc9747fa28cd"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:49:45.499865 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:45.499814 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/05aec5cd-e962-44dd-aa90-cc9747fa28cd-tokenizer-uds\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:49:45.499865 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:45.499835 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/05aec5cd-e962-44dd-aa90-cc9747fa28cd-tokenizer-cache\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:49:45.500120 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:45.500092 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05aec5cd-e962-44dd-aa90-cc9747fa28cd-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "05aec5cd-e962-44dd-aa90-cc9747fa28cd" (UID: "05aec5cd-e962-44dd-aa90-cc9747fa28cd"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:49:45.501751 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:45.501728 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05aec5cd-e962-44dd-aa90-cc9747fa28cd-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "05aec5cd-e962-44dd-aa90-cc9747fa28cd" (UID: "05aec5cd-e962-44dd-aa90-cc9747fa28cd"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:49:45.501807 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:45.501784 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05aec5cd-e962-44dd-aa90-cc9747fa28cd-kube-api-access-8wz27" (OuterVolumeSpecName: "kube-api-access-8wz27") pod "05aec5cd-e962-44dd-aa90-cc9747fa28cd" (UID: "05aec5cd-e962-44dd-aa90-cc9747fa28cd"). 
InnerVolumeSpecName "kube-api-access-8wz27". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:49:45.601283 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:45.601241 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8wz27\" (UniqueName: \"kubernetes.io/projected/05aec5cd-e962-44dd-aa90-cc9747fa28cd-kube-api-access-8wz27\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:49:45.601283 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:45.601276 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05aec5cd-e962-44dd-aa90-cc9747fa28cd-kserve-provision-location\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:49:45.601283 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:45.601287 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/05aec5cd-e962-44dd-aa90-cc9747fa28cd-tokenizer-tmp\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:49:45.601283 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:45.601297 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/05aec5cd-e962-44dd-aa90-cc9747fa28cd-tls-certs\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:49:46.121540 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:46.121505 2569 generic.go:358] "Generic (PLEG): container finished" podID="05aec5cd-e962-44dd-aa90-cc9747fa28cd" containerID="f311971831d280c08100aa4d7dba4619ab94c1856df91caa05057d9aaaaf1013" exitCode=0 Apr 21 15:49:46.121920 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:46.121582 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" 
event={"ID":"05aec5cd-e962-44dd-aa90-cc9747fa28cd","Type":"ContainerDied","Data":"f311971831d280c08100aa4d7dba4619ab94c1856df91caa05057d9aaaaf1013"} Apr 21 15:49:46.121920 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:46.121605 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" event={"ID":"05aec5cd-e962-44dd-aa90-cc9747fa28cd","Type":"ContainerDied","Data":"25b71999f104d1ea2fe2d16e732b8dfb727fe6ebd80f1524fbf7468808389d24"} Apr 21 15:49:46.121920 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:46.121619 2569 scope.go:117] "RemoveContainer" containerID="f311971831d280c08100aa4d7dba4619ab94c1856df91caa05057d9aaaaf1013" Apr 21 15:49:46.121920 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:46.121633 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm" Apr 21 15:49:46.131350 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:46.131322 2569 scope.go:117] "RemoveContainer" containerID="30ce85de56643e7e1186a34c071dbeb3d5411ba1d14aaa701484af22309bf9e8" Apr 21 15:49:46.139246 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:46.139228 2569 scope.go:117] "RemoveContainer" containerID="876724ea5e7fc6bc329f9fb597e0e604e793dfe158957b7c0db0bf603f342da1" Apr 21 15:49:46.145869 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:46.145845 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm"] Apr 21 15:49:46.147615 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:46.147514 2569 scope.go:117] "RemoveContainer" containerID="f311971831d280c08100aa4d7dba4619ab94c1856df91caa05057d9aaaaf1013" Apr 21 15:49:46.147887 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:49:46.147858 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"f311971831d280c08100aa4d7dba4619ab94c1856df91caa05057d9aaaaf1013\": container with ID starting with f311971831d280c08100aa4d7dba4619ab94c1856df91caa05057d9aaaaf1013 not found: ID does not exist" containerID="f311971831d280c08100aa4d7dba4619ab94c1856df91caa05057d9aaaaf1013" Apr 21 15:49:46.148017 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:46.147897 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f311971831d280c08100aa4d7dba4619ab94c1856df91caa05057d9aaaaf1013"} err="failed to get container status \"f311971831d280c08100aa4d7dba4619ab94c1856df91caa05057d9aaaaf1013\": rpc error: code = NotFound desc = could not find container \"f311971831d280c08100aa4d7dba4619ab94c1856df91caa05057d9aaaaf1013\": container with ID starting with f311971831d280c08100aa4d7dba4619ab94c1856df91caa05057d9aaaaf1013 not found: ID does not exist" Apr 21 15:49:46.148017 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:46.147925 2569 scope.go:117] "RemoveContainer" containerID="30ce85de56643e7e1186a34c071dbeb3d5411ba1d14aaa701484af22309bf9e8" Apr 21 15:49:46.148197 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:49:46.148174 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30ce85de56643e7e1186a34c071dbeb3d5411ba1d14aaa701484af22309bf9e8\": container with ID starting with 30ce85de56643e7e1186a34c071dbeb3d5411ba1d14aaa701484af22309bf9e8 not found: ID does not exist" containerID="30ce85de56643e7e1186a34c071dbeb3d5411ba1d14aaa701484af22309bf9e8" Apr 21 15:49:46.148272 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:46.148205 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30ce85de56643e7e1186a34c071dbeb3d5411ba1d14aaa701484af22309bf9e8"} err="failed to get container status \"30ce85de56643e7e1186a34c071dbeb3d5411ba1d14aaa701484af22309bf9e8\": rpc error: code = NotFound desc = could not find container 
\"30ce85de56643e7e1186a34c071dbeb3d5411ba1d14aaa701484af22309bf9e8\": container with ID starting with 30ce85de56643e7e1186a34c071dbeb3d5411ba1d14aaa701484af22309bf9e8 not found: ID does not exist" Apr 21 15:49:46.148272 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:46.148223 2569 scope.go:117] "RemoveContainer" containerID="876724ea5e7fc6bc329f9fb597e0e604e793dfe158957b7c0db0bf603f342da1" Apr 21 15:49:46.148637 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:49:46.148608 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"876724ea5e7fc6bc329f9fb597e0e604e793dfe158957b7c0db0bf603f342da1\": container with ID starting with 876724ea5e7fc6bc329f9fb597e0e604e793dfe158957b7c0db0bf603f342da1 not found: ID does not exist" containerID="876724ea5e7fc6bc329f9fb597e0e604e793dfe158957b7c0db0bf603f342da1" Apr 21 15:49:46.148720 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:46.148643 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"876724ea5e7fc6bc329f9fb597e0e604e793dfe158957b7c0db0bf603f342da1"} err="failed to get container status \"876724ea5e7fc6bc329f9fb597e0e604e793dfe158957b7c0db0bf603f342da1\": rpc error: code = NotFound desc = could not find container \"876724ea5e7fc6bc329f9fb597e0e604e793dfe158957b7c0db0bf603f342da1\": container with ID starting with 876724ea5e7fc6bc329f9fb597e0e604e793dfe158957b7c0db0bf603f342da1 not found: ID does not exist" Apr 21 15:49:46.149709 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:46.149691 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-64d55lm4dm"] Apr 21 15:49:47.055242 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:47.055210 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05aec5cd-e962-44dd-aa90-cc9747fa28cd" path="/var/lib/kubelet/pods/05aec5cd-e962-44dd-aa90-cc9747fa28cd/volumes" Apr 21 
15:49:54.105362 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.105319 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh"] Apr 21 15:49:54.105935 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.105824 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05aec5cd-e962-44dd-aa90-cc9747fa28cd" containerName="main" Apr 21 15:49:54.105935 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.105862 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="05aec5cd-e962-44dd-aa90-cc9747fa28cd" containerName="main" Apr 21 15:49:54.105935 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.105883 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05aec5cd-e962-44dd-aa90-cc9747fa28cd" containerName="storage-initializer" Apr 21 15:49:54.105935 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.105892 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="05aec5cd-e962-44dd-aa90-cc9747fa28cd" containerName="storage-initializer" Apr 21 15:49:54.105935 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.105912 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05aec5cd-e962-44dd-aa90-cc9747fa28cd" containerName="tokenizer" Apr 21 15:49:54.105935 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.105920 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="05aec5cd-e962-44dd-aa90-cc9747fa28cd" containerName="tokenizer" Apr 21 15:49:54.106249 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.106002 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="05aec5cd-e962-44dd-aa90-cc9747fa28cd" containerName="main" Apr 21 15:49:54.106249 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.106017 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="05aec5cd-e962-44dd-aa90-cc9747fa28cd" containerName="tokenizer" Apr 21 15:49:54.452971 
ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.452933 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh"] Apr 21 15:49:54.453128 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.453073 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" Apr 21 15:49:54.456942 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.456916 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 21 15:49:54.457105 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.457056 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-lp6xp\"" Apr 21 15:49:54.457105 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.457077 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 21 15:49:54.457224 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.457107 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-wfr98\"" Apr 21 15:49:54.457408 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.457390 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 21 15:49:54.478390 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.478356 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfgms\" (UniqueName: \"kubernetes.io/projected/256d2aae-3180-48b3-a6b6-f0628fa12ec4-kube-api-access-cfgms\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh\" (UID: \"256d2aae-3180-48b3-a6b6-f0628fa12ec4\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" Apr 21 15:49:54.478542 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.478405 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/256d2aae-3180-48b3-a6b6-f0628fa12ec4-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh\" (UID: \"256d2aae-3180-48b3-a6b6-f0628fa12ec4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" Apr 21 15:49:54.478542 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.478432 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/256d2aae-3180-48b3-a6b6-f0628fa12ec4-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh\" (UID: \"256d2aae-3180-48b3-a6b6-f0628fa12ec4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" Apr 21 15:49:54.478542 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.478490 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/256d2aae-3180-48b3-a6b6-f0628fa12ec4-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh\" (UID: \"256d2aae-3180-48b3-a6b6-f0628fa12ec4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" Apr 21 15:49:54.478542 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.478533 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/256d2aae-3180-48b3-a6b6-f0628fa12ec4-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh\" (UID: 
\"256d2aae-3180-48b3-a6b6-f0628fa12ec4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" Apr 21 15:49:54.478900 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.478591 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/256d2aae-3180-48b3-a6b6-f0628fa12ec4-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh\" (UID: \"256d2aae-3180-48b3-a6b6-f0628fa12ec4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" Apr 21 15:49:54.579805 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.579774 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfgms\" (UniqueName: \"kubernetes.io/projected/256d2aae-3180-48b3-a6b6-f0628fa12ec4-kube-api-access-cfgms\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh\" (UID: \"256d2aae-3180-48b3-a6b6-f0628fa12ec4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" Apr 21 15:49:54.579988 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.579813 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/256d2aae-3180-48b3-a6b6-f0628fa12ec4-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh\" (UID: \"256d2aae-3180-48b3-a6b6-f0628fa12ec4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" Apr 21 15:49:54.579988 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.579834 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/256d2aae-3180-48b3-a6b6-f0628fa12ec4-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh\" (UID: 
\"256d2aae-3180-48b3-a6b6-f0628fa12ec4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" Apr 21 15:49:54.579988 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.579875 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/256d2aae-3180-48b3-a6b6-f0628fa12ec4-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh\" (UID: \"256d2aae-3180-48b3-a6b6-f0628fa12ec4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" Apr 21 15:49:54.579988 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.579916 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/256d2aae-3180-48b3-a6b6-f0628fa12ec4-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh\" (UID: \"256d2aae-3180-48b3-a6b6-f0628fa12ec4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" Apr 21 15:49:54.579988 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.579972 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/256d2aae-3180-48b3-a6b6-f0628fa12ec4-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh\" (UID: \"256d2aae-3180-48b3-a6b6-f0628fa12ec4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" Apr 21 15:49:54.580357 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.580333 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/256d2aae-3180-48b3-a6b6-f0628fa12ec4-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh\" (UID: \"256d2aae-3180-48b3-a6b6-f0628fa12ec4\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" Apr 21 15:49:54.580439 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.580361 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/256d2aae-3180-48b3-a6b6-f0628fa12ec4-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh\" (UID: \"256d2aae-3180-48b3-a6b6-f0628fa12ec4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" Apr 21 15:49:54.580439 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.580406 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/256d2aae-3180-48b3-a6b6-f0628fa12ec4-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh\" (UID: \"256d2aae-3180-48b3-a6b6-f0628fa12ec4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" Apr 21 15:49:54.580556 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.580461 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/256d2aae-3180-48b3-a6b6-f0628fa12ec4-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh\" (UID: \"256d2aae-3180-48b3-a6b6-f0628fa12ec4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" Apr 21 15:49:54.582356 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.582334 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/256d2aae-3180-48b3-a6b6-f0628fa12ec4-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh\" (UID: \"256d2aae-3180-48b3-a6b6-f0628fa12ec4\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" Apr 21 15:49:54.591325 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.591296 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfgms\" (UniqueName: \"kubernetes.io/projected/256d2aae-3180-48b3-a6b6-f0628fa12ec4-kube-api-access-cfgms\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh\" (UID: \"256d2aae-3180-48b3-a6b6-f0628fa12ec4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" Apr 21 15:49:54.762866 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.762775 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" Apr 21 15:49:54.890017 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:54.889983 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh"] Apr 21 15:49:54.891841 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:49:54.891813 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod256d2aae_3180_48b3_a6b6_f0628fa12ec4.slice/crio-3eb4cbb6f04fe7c3e404613076677329345f9f860a32e301151c2bd692ab31d9 WatchSource:0}: Error finding container 3eb4cbb6f04fe7c3e404613076677329345f9f860a32e301151c2bd692ab31d9: Status 404 returned error can't find the container with id 3eb4cbb6f04fe7c3e404613076677329345f9f860a32e301151c2bd692ab31d9 Apr 21 15:49:55.155369 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:55.155329 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" event={"ID":"256d2aae-3180-48b3-a6b6-f0628fa12ec4","Type":"ContainerStarted","Data":"f63a83f315e7b2108d0bbb2c79aa398158b0ac66f30884a0bbcf575759a5cc48"} Apr 21 
15:49:55.155369 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:55.155372 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" event={"ID":"256d2aae-3180-48b3-a6b6-f0628fa12ec4","Type":"ContainerStarted","Data":"3eb4cbb6f04fe7c3e404613076677329345f9f860a32e301151c2bd692ab31d9"} Apr 21 15:49:56.160113 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:56.160022 2569 generic.go:358] "Generic (PLEG): container finished" podID="256d2aae-3180-48b3-a6b6-f0628fa12ec4" containerID="f63a83f315e7b2108d0bbb2c79aa398158b0ac66f30884a0bbcf575759a5cc48" exitCode=0 Apr 21 15:49:56.160473 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:56.160113 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" event={"ID":"256d2aae-3180-48b3-a6b6-f0628fa12ec4","Type":"ContainerDied","Data":"f63a83f315e7b2108d0bbb2c79aa398158b0ac66f30884a0bbcf575759a5cc48"} Apr 21 15:49:57.165907 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:57.165871 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" event={"ID":"256d2aae-3180-48b3-a6b6-f0628fa12ec4","Type":"ContainerStarted","Data":"b1ec9bd9f7265947efc558a2c11ef8c443a483ad1d8e288d2052d6a0b231e086"} Apr 21 15:49:57.165907 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:57.165908 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" event={"ID":"256d2aae-3180-48b3-a6b6-f0628fa12ec4","Type":"ContainerStarted","Data":"9053321508ed94576fef223ec0f410cf5ab66a343ed07659a878ce0ee69cddf8"} Apr 21 15:49:57.166354 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:57.166019 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" Apr 21 15:49:57.189628 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:49:57.189579 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" podStartSLOduration=3.189564024 podStartE2EDuration="3.189564024s" podCreationTimestamp="2026-04-21 15:49:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:49:57.186537102 +0000 UTC m=+872.758570043" watchObservedRunningTime="2026-04-21 15:49:57.189564024 +0000 UTC m=+872.761596967" Apr 21 15:50:04.762978 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:04.762937 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" Apr 21 15:50:04.763533 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:04.763094 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" Apr 21 15:50:04.766026 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:04.766000 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" Apr 21 15:50:05.199506 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:05.199454 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" Apr 21 15:50:24.991236 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:24.991205 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wkxqg_7b0083d8-b152-40fa-9a89-e3180ed1747d/ovn-acl-logging/0.log" Apr 21 15:50:24.991813 ip-10-0-128-232 
kubenswrapper[2569]: I0421 15:50:24.991712 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wkxqg_7b0083d8-b152-40fa-9a89-e3180ed1747d/ovn-acl-logging/0.log" Apr 21 15:50:27.206562 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:27.206531 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" Apr 21 15:50:28.806007 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:28.805968 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh"] Apr 21 15:50:28.806461 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:28.806406 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" podUID="256d2aae-3180-48b3-a6b6-f0628fa12ec4" containerName="main" containerID="cri-o://9053321508ed94576fef223ec0f410cf5ab66a343ed07659a878ce0ee69cddf8" gracePeriod=30 Apr 21 15:50:28.806571 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:28.806465 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" podUID="256d2aae-3180-48b3-a6b6-f0628fa12ec4" containerName="tokenizer" containerID="cri-o://b1ec9bd9f7265947efc558a2c11ef8c443a483ad1d8e288d2052d6a0b231e086" gracePeriod=30 Apr 21 15:50:29.288496 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:29.288438 2569 generic.go:358] "Generic (PLEG): container finished" podID="256d2aae-3180-48b3-a6b6-f0628fa12ec4" containerID="9053321508ed94576fef223ec0f410cf5ab66a343ed07659a878ce0ee69cddf8" exitCode=0 Apr 21 15:50:29.288658 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:29.288512 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" event={"ID":"256d2aae-3180-48b3-a6b6-f0628fa12ec4","Type":"ContainerDied","Data":"9053321508ed94576fef223ec0f410cf5ab66a343ed07659a878ce0ee69cddf8"} Apr 21 15:50:30.048319 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.048295 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" Apr 21 15:50:30.200343 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.200311 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/256d2aae-3180-48b3-a6b6-f0628fa12ec4-tokenizer-cache\") pod \"256d2aae-3180-48b3-a6b6-f0628fa12ec4\" (UID: \"256d2aae-3180-48b3-a6b6-f0628fa12ec4\") " Apr 21 15:50:30.200502 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.200355 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfgms\" (UniqueName: \"kubernetes.io/projected/256d2aae-3180-48b3-a6b6-f0628fa12ec4-kube-api-access-cfgms\") pod \"256d2aae-3180-48b3-a6b6-f0628fa12ec4\" (UID: \"256d2aae-3180-48b3-a6b6-f0628fa12ec4\") " Apr 21 15:50:30.200502 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.200382 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/256d2aae-3180-48b3-a6b6-f0628fa12ec4-tokenizer-uds\") pod \"256d2aae-3180-48b3-a6b6-f0628fa12ec4\" (UID: \"256d2aae-3180-48b3-a6b6-f0628fa12ec4\") " Apr 21 15:50:30.200502 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.200414 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/256d2aae-3180-48b3-a6b6-f0628fa12ec4-tls-certs\") pod \"256d2aae-3180-48b3-a6b6-f0628fa12ec4\" (UID: \"256d2aae-3180-48b3-a6b6-f0628fa12ec4\") " Apr 21 
15:50:30.200502 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.200450 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/256d2aae-3180-48b3-a6b6-f0628fa12ec4-tokenizer-tmp\") pod \"256d2aae-3180-48b3-a6b6-f0628fa12ec4\" (UID: \"256d2aae-3180-48b3-a6b6-f0628fa12ec4\") " Apr 21 15:50:30.200502 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.200471 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/256d2aae-3180-48b3-a6b6-f0628fa12ec4-kserve-provision-location\") pod \"256d2aae-3180-48b3-a6b6-f0628fa12ec4\" (UID: \"256d2aae-3180-48b3-a6b6-f0628fa12ec4\") " Apr 21 15:50:30.200772 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.200630 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/256d2aae-3180-48b3-a6b6-f0628fa12ec4-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "256d2aae-3180-48b3-a6b6-f0628fa12ec4" (UID: "256d2aae-3180-48b3-a6b6-f0628fa12ec4"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:50:30.200856 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.200830 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/256d2aae-3180-48b3-a6b6-f0628fa12ec4-tokenizer-cache\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:50:30.200917 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.200842 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/256d2aae-3180-48b3-a6b6-f0628fa12ec4-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "256d2aae-3180-48b3-a6b6-f0628fa12ec4" (UID: "256d2aae-3180-48b3-a6b6-f0628fa12ec4"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:50:30.200917 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.200898 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/256d2aae-3180-48b3-a6b6-f0628fa12ec4-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "256d2aae-3180-48b3-a6b6-f0628fa12ec4" (UID: "256d2aae-3180-48b3-a6b6-f0628fa12ec4"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:50:30.201249 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.201229 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/256d2aae-3180-48b3-a6b6-f0628fa12ec4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "256d2aae-3180-48b3-a6b6-f0628fa12ec4" (UID: "256d2aae-3180-48b3-a6b6-f0628fa12ec4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:50:30.202581 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.202560 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/256d2aae-3180-48b3-a6b6-f0628fa12ec4-kube-api-access-cfgms" (OuterVolumeSpecName: "kube-api-access-cfgms") pod "256d2aae-3180-48b3-a6b6-f0628fa12ec4" (UID: "256d2aae-3180-48b3-a6b6-f0628fa12ec4"). InnerVolumeSpecName "kube-api-access-cfgms". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:50:30.202658 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.202590 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/256d2aae-3180-48b3-a6b6-f0628fa12ec4-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "256d2aae-3180-48b3-a6b6-f0628fa12ec4" (UID: "256d2aae-3180-48b3-a6b6-f0628fa12ec4"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:50:30.294183 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.294147 2569 generic.go:358] "Generic (PLEG): container finished" podID="256d2aae-3180-48b3-a6b6-f0628fa12ec4" containerID="b1ec9bd9f7265947efc558a2c11ef8c443a483ad1d8e288d2052d6a0b231e086" exitCode=0 Apr 21 15:50:30.294376 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.294180 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" event={"ID":"256d2aae-3180-48b3-a6b6-f0628fa12ec4","Type":"ContainerDied","Data":"b1ec9bd9f7265947efc558a2c11ef8c443a483ad1d8e288d2052d6a0b231e086"} Apr 21 15:50:30.294376 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.294216 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" Apr 21 15:50:30.294376 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.294229 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh" event={"ID":"256d2aae-3180-48b3-a6b6-f0628fa12ec4","Type":"ContainerDied","Data":"3eb4cbb6f04fe7c3e404613076677329345f9f860a32e301151c2bd692ab31d9"} Apr 21 15:50:30.294376 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.294245 2569 scope.go:117] "RemoveContainer" containerID="b1ec9bd9f7265947efc558a2c11ef8c443a483ad1d8e288d2052d6a0b231e086" Apr 21 15:50:30.301900 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.301874 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/256d2aae-3180-48b3-a6b6-f0628fa12ec4-tokenizer-tmp\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:50:30.302018 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.301904 2569 reconciler_common.go:299] "Volume detached for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/256d2aae-3180-48b3-a6b6-f0628fa12ec4-kserve-provision-location\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:50:30.302018 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.301920 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cfgms\" (UniqueName: \"kubernetes.io/projected/256d2aae-3180-48b3-a6b6-f0628fa12ec4-kube-api-access-cfgms\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:50:30.302018 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.301933 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/256d2aae-3180-48b3-a6b6-f0628fa12ec4-tokenizer-uds\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:50:30.302018 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.301946 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/256d2aae-3180-48b3-a6b6-f0628fa12ec4-tls-certs\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:50:30.302924 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.302903 2569 scope.go:117] "RemoveContainer" containerID="9053321508ed94576fef223ec0f410cf5ab66a343ed07659a878ce0ee69cddf8" Apr 21 15:50:30.310552 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.310535 2569 scope.go:117] "RemoveContainer" containerID="f63a83f315e7b2108d0bbb2c79aa398158b0ac66f30884a0bbcf575759a5cc48" Apr 21 15:50:30.317806 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.317776 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh"] Apr 21 15:50:30.318450 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.318434 2569 scope.go:117] "RemoveContainer" containerID="b1ec9bd9f7265947efc558a2c11ef8c443a483ad1d8e288d2052d6a0b231e086" Apr 21 15:50:30.318731 ip-10-0-128-232 
kubenswrapper[2569]: E0421 15:50:30.318713 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1ec9bd9f7265947efc558a2c11ef8c443a483ad1d8e288d2052d6a0b231e086\": container with ID starting with b1ec9bd9f7265947efc558a2c11ef8c443a483ad1d8e288d2052d6a0b231e086 not found: ID does not exist" containerID="b1ec9bd9f7265947efc558a2c11ef8c443a483ad1d8e288d2052d6a0b231e086" Apr 21 15:50:30.318800 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.318738 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1ec9bd9f7265947efc558a2c11ef8c443a483ad1d8e288d2052d6a0b231e086"} err="failed to get container status \"b1ec9bd9f7265947efc558a2c11ef8c443a483ad1d8e288d2052d6a0b231e086\": rpc error: code = NotFound desc = could not find container \"b1ec9bd9f7265947efc558a2c11ef8c443a483ad1d8e288d2052d6a0b231e086\": container with ID starting with b1ec9bd9f7265947efc558a2c11ef8c443a483ad1d8e288d2052d6a0b231e086 not found: ID does not exist" Apr 21 15:50:30.318800 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.318765 2569 scope.go:117] "RemoveContainer" containerID="9053321508ed94576fef223ec0f410cf5ab66a343ed07659a878ce0ee69cddf8" Apr 21 15:50:30.319116 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:50:30.319088 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9053321508ed94576fef223ec0f410cf5ab66a343ed07659a878ce0ee69cddf8\": container with ID starting with 9053321508ed94576fef223ec0f410cf5ab66a343ed07659a878ce0ee69cddf8 not found: ID does not exist" containerID="9053321508ed94576fef223ec0f410cf5ab66a343ed07659a878ce0ee69cddf8" Apr 21 15:50:30.319222 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.319124 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9053321508ed94576fef223ec0f410cf5ab66a343ed07659a878ce0ee69cddf8"} 
err="failed to get container status \"9053321508ed94576fef223ec0f410cf5ab66a343ed07659a878ce0ee69cddf8\": rpc error: code = NotFound desc = could not find container \"9053321508ed94576fef223ec0f410cf5ab66a343ed07659a878ce0ee69cddf8\": container with ID starting with 9053321508ed94576fef223ec0f410cf5ab66a343ed07659a878ce0ee69cddf8 not found: ID does not exist" Apr 21 15:50:30.319222 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.319150 2569 scope.go:117] "RemoveContainer" containerID="f63a83f315e7b2108d0bbb2c79aa398158b0ac66f30884a0bbcf575759a5cc48" Apr 21 15:50:30.319448 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:50:30.319424 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f63a83f315e7b2108d0bbb2c79aa398158b0ac66f30884a0bbcf575759a5cc48\": container with ID starting with f63a83f315e7b2108d0bbb2c79aa398158b0ac66f30884a0bbcf575759a5cc48 not found: ID does not exist" containerID="f63a83f315e7b2108d0bbb2c79aa398158b0ac66f30884a0bbcf575759a5cc48" Apr 21 15:50:30.319552 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.319455 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f63a83f315e7b2108d0bbb2c79aa398158b0ac66f30884a0bbcf575759a5cc48"} err="failed to get container status \"f63a83f315e7b2108d0bbb2c79aa398158b0ac66f30884a0bbcf575759a5cc48\": rpc error: code = NotFound desc = could not find container \"f63a83f315e7b2108d0bbb2c79aa398158b0ac66f30884a0bbcf575759a5cc48\": container with ID starting with f63a83f315e7b2108d0bbb2c79aa398158b0ac66f30884a0bbcf575759a5cc48 not found: ID does not exist" Apr 21 15:50:30.321638 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:30.321618 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8htthh"] Apr 21 15:50:31.054668 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:31.054632 2569 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="256d2aae-3180-48b3-a6b6-f0628fa12ec4" path="/var/lib/kubelet/pods/256d2aae-3180-48b3-a6b6-f0628fa12ec4/volumes" Apr 21 15:50:41.274758 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.274722 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc"] Apr 21 15:50:41.275263 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.275060 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="256d2aae-3180-48b3-a6b6-f0628fa12ec4" containerName="storage-initializer" Apr 21 15:50:41.275263 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.275070 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="256d2aae-3180-48b3-a6b6-f0628fa12ec4" containerName="storage-initializer" Apr 21 15:50:41.275263 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.275085 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="256d2aae-3180-48b3-a6b6-f0628fa12ec4" containerName="main" Apr 21 15:50:41.275263 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.275093 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="256d2aae-3180-48b3-a6b6-f0628fa12ec4" containerName="main" Apr 21 15:50:41.275263 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.275114 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="256d2aae-3180-48b3-a6b6-f0628fa12ec4" containerName="tokenizer" Apr 21 15:50:41.275263 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.275122 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="256d2aae-3180-48b3-a6b6-f0628fa12ec4" containerName="tokenizer" Apr 21 15:50:41.275263 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.275198 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="256d2aae-3180-48b3-a6b6-f0628fa12ec4" containerName="tokenizer" Apr 21 15:50:41.275263 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.275210 2569 
memory_manager.go:356] "RemoveStaleState removing state" podUID="256d2aae-3180-48b3-a6b6-f0628fa12ec4" containerName="main" Apr 21 15:50:41.279889 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.279863 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" Apr 21 15:50:41.282426 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.282401 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 21 15:50:41.282582 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.282401 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-lp6xp\"" Apr 21 15:50:41.283607 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.283585 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 21 15:50:41.284564 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.284540 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 21 15:50:41.287893 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.287867 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc"] Apr 21 15:50:41.290518 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.290496 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ac2b6778-f5e1-4a7d-b8b8-711d30992507-home\") pod \"precise-prefix-cache-test-kserve-7b7596699d-29gmc\" (UID: \"ac2b6778-f5e1-4a7d-b8b8-711d30992507\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" Apr 21 15:50:41.290630 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.290553 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ac2b6778-f5e1-4a7d-b8b8-711d30992507-tls-certs\") pod \"precise-prefix-cache-test-kserve-7b7596699d-29gmc\" (UID: \"ac2b6778-f5e1-4a7d-b8b8-711d30992507\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" Apr 21 15:50:41.290691 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.290639 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ac2b6778-f5e1-4a7d-b8b8-711d30992507-model-cache\") pod \"precise-prefix-cache-test-kserve-7b7596699d-29gmc\" (UID: \"ac2b6778-f5e1-4a7d-b8b8-711d30992507\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" Apr 21 15:50:41.290750 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.290694 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac2b6778-f5e1-4a7d-b8b8-711d30992507-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-7b7596699d-29gmc\" (UID: \"ac2b6778-f5e1-4a7d-b8b8-711d30992507\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" Apr 21 15:50:41.290750 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.290719 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr49n\" (UniqueName: \"kubernetes.io/projected/ac2b6778-f5e1-4a7d-b8b8-711d30992507-kube-api-access-jr49n\") pod \"precise-prefix-cache-test-kserve-7b7596699d-29gmc\" (UID: \"ac2b6778-f5e1-4a7d-b8b8-711d30992507\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" Apr 21 15:50:41.290868 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.290769 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ac2b6778-f5e1-4a7d-b8b8-711d30992507-dshm\") pod \"precise-prefix-cache-test-kserve-7b7596699d-29gmc\" (UID: \"ac2b6778-f5e1-4a7d-b8b8-711d30992507\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" Apr 21 15:50:41.391864 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.391822 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ac2b6778-f5e1-4a7d-b8b8-711d30992507-tls-certs\") pod \"precise-prefix-cache-test-kserve-7b7596699d-29gmc\" (UID: \"ac2b6778-f5e1-4a7d-b8b8-711d30992507\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" Apr 21 15:50:41.392042 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.391877 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ac2b6778-f5e1-4a7d-b8b8-711d30992507-model-cache\") pod \"precise-prefix-cache-test-kserve-7b7596699d-29gmc\" (UID: \"ac2b6778-f5e1-4a7d-b8b8-711d30992507\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" Apr 21 15:50:41.392042 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.391915 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac2b6778-f5e1-4a7d-b8b8-711d30992507-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-7b7596699d-29gmc\" (UID: \"ac2b6778-f5e1-4a7d-b8b8-711d30992507\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" Apr 21 15:50:41.392042 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.391934 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jr49n\" (UniqueName: 
\"kubernetes.io/projected/ac2b6778-f5e1-4a7d-b8b8-711d30992507-kube-api-access-jr49n\") pod \"precise-prefix-cache-test-kserve-7b7596699d-29gmc\" (UID: \"ac2b6778-f5e1-4a7d-b8b8-711d30992507\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" Apr 21 15:50:41.392042 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.391954 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ac2b6778-f5e1-4a7d-b8b8-711d30992507-dshm\") pod \"precise-prefix-cache-test-kserve-7b7596699d-29gmc\" (UID: \"ac2b6778-f5e1-4a7d-b8b8-711d30992507\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" Apr 21 15:50:41.392042 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.391984 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ac2b6778-f5e1-4a7d-b8b8-711d30992507-home\") pod \"precise-prefix-cache-test-kserve-7b7596699d-29gmc\" (UID: \"ac2b6778-f5e1-4a7d-b8b8-711d30992507\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" Apr 21 15:50:41.392381 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.392357 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ac2b6778-f5e1-4a7d-b8b8-711d30992507-model-cache\") pod \"precise-prefix-cache-test-kserve-7b7596699d-29gmc\" (UID: \"ac2b6778-f5e1-4a7d-b8b8-711d30992507\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" Apr 21 15:50:41.392436 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.392390 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac2b6778-f5e1-4a7d-b8b8-711d30992507-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-7b7596699d-29gmc\" (UID: \"ac2b6778-f5e1-4a7d-b8b8-711d30992507\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" Apr 21 15:50:41.392436 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.392403 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ac2b6778-f5e1-4a7d-b8b8-711d30992507-home\") pod \"precise-prefix-cache-test-kserve-7b7596699d-29gmc\" (UID: \"ac2b6778-f5e1-4a7d-b8b8-711d30992507\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" Apr 21 15:50:41.394195 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.394169 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ac2b6778-f5e1-4a7d-b8b8-711d30992507-dshm\") pod \"precise-prefix-cache-test-kserve-7b7596699d-29gmc\" (UID: \"ac2b6778-f5e1-4a7d-b8b8-711d30992507\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" Apr 21 15:50:41.394377 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.394358 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ac2b6778-f5e1-4a7d-b8b8-711d30992507-tls-certs\") pod \"precise-prefix-cache-test-kserve-7b7596699d-29gmc\" (UID: \"ac2b6778-f5e1-4a7d-b8b8-711d30992507\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" Apr 21 15:50:41.399878 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.399856 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr49n\" (UniqueName: \"kubernetes.io/projected/ac2b6778-f5e1-4a7d-b8b8-711d30992507-kube-api-access-jr49n\") pod \"precise-prefix-cache-test-kserve-7b7596699d-29gmc\" (UID: \"ac2b6778-f5e1-4a7d-b8b8-711d30992507\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" Apr 21 15:50:41.506763 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.506729 2569 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j"] Apr 21 15:50:41.510548 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.510530 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" Apr 21 15:50:41.513294 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.513270 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-epp-sa-dockercfg-ncpmb\"" Apr 21 15:50:41.524511 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.524464 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j"] Apr 21 15:50:41.592548 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.592440 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" Apr 21 15:50:41.593056 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.593014 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j\" (UID: \"0a3ed068-e7ca-4c4e-8258-8f1c88153eee\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" Apr 21 15:50:41.593179 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.593060 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j\" (UID: \"0a3ed068-e7ca-4c4e-8258-8f1c88153eee\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" Apr 21 15:50:41.593179 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.593132 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j\" (UID: \"0a3ed068-e7ca-4c4e-8258-8f1c88153eee\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" Apr 21 15:50:41.593179 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.593172 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pppb\" (UniqueName: \"kubernetes.io/projected/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-kube-api-access-2pppb\") pod \"precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j\" (UID: \"0a3ed068-e7ca-4c4e-8258-8f1c88153eee\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" Apr 21 15:50:41.593650 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.593214 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j\" (UID: \"0a3ed068-e7ca-4c4e-8258-8f1c88153eee\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" Apr 21 15:50:41.593650 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.593287 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j\" (UID: 
\"0a3ed068-e7ca-4c4e-8258-8f1c88153eee\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" Apr 21 15:50:41.694494 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.694446 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j\" (UID: \"0a3ed068-e7ca-4c4e-8258-8f1c88153eee\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" Apr 21 15:50:41.694666 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.694510 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j\" (UID: \"0a3ed068-e7ca-4c4e-8258-8f1c88153eee\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" Apr 21 15:50:41.694666 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.694532 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j\" (UID: \"0a3ed068-e7ca-4c4e-8258-8f1c88153eee\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" Apr 21 15:50:41.694666 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.694560 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j\" (UID: \"0a3ed068-e7ca-4c4e-8258-8f1c88153eee\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" Apr 21 15:50:41.694666 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.694595 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2pppb\" (UniqueName: \"kubernetes.io/projected/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-kube-api-access-2pppb\") pod \"precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j\" (UID: \"0a3ed068-e7ca-4c4e-8258-8f1c88153eee\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" Apr 21 15:50:41.694666 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.694636 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j\" (UID: \"0a3ed068-e7ca-4c4e-8258-8f1c88153eee\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" Apr 21 15:50:41.695025 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.694966 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j\" (UID: \"0a3ed068-e7ca-4c4e-8258-8f1c88153eee\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" Apr 21 15:50:41.695372 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.695350 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j\" (UID: \"0a3ed068-e7ca-4c4e-8258-8f1c88153eee\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" Apr 21 15:50:41.695687 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.695378 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j\" (UID: \"0a3ed068-e7ca-4c4e-8258-8f1c88153eee\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" Apr 21 15:50:41.695687 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.695423 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j\" (UID: \"0a3ed068-e7ca-4c4e-8258-8f1c88153eee\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" Apr 21 15:50:41.697261 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.697239 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j\" (UID: \"0a3ed068-e7ca-4c4e-8258-8f1c88153eee\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" Apr 21 15:50:41.703414 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.703385 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pppb\" (UniqueName: \"kubernetes.io/projected/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-kube-api-access-2pppb\") pod \"precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j\" (UID: \"0a3ed068-e7ca-4c4e-8258-8f1c88153eee\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" Apr 21 15:50:41.718623 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.718599 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc"] Apr 21 15:50:41.720606 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:50:41.720579 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac2b6778_f5e1_4a7d_b8b8_711d30992507.slice/crio-a94f0b18a933a0297ef3f4811f4d6b30143c9e2d01912a3663904b639b826a30 WatchSource:0}: Error finding container a94f0b18a933a0297ef3f4811f4d6b30143c9e2d01912a3663904b639b826a30: Status 404 returned error can't find the container with id a94f0b18a933a0297ef3f4811f4d6b30143c9e2d01912a3663904b639b826a30 Apr 21 15:50:41.820439 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.820402 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" Apr 21 15:50:41.955992 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:41.955960 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j"] Apr 21 15:50:41.957679 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:50:41.957654 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a3ed068_e7ca_4c4e_8258_8f1c88153eee.slice/crio-5bf1f5b8906e2957f9d8aee6e4cd182466bf7b309f65028a40c3931299546f68 WatchSource:0}: Error finding container 5bf1f5b8906e2957f9d8aee6e4cd182466bf7b309f65028a40c3931299546f68: Status 404 returned error can't find the container with id 5bf1f5b8906e2957f9d8aee6e4cd182466bf7b309f65028a40c3931299546f68 Apr 21 15:50:42.340869 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:42.340826 2569 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" event={"ID":"0a3ed068-e7ca-4c4e-8258-8f1c88153eee","Type":"ContainerStarted","Data":"6e2387877f595513eceb9a2f482ea5dd0d6a18661c8326385cf8f2f65b45ae31"} Apr 21 15:50:42.340869 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:42.340878 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" event={"ID":"0a3ed068-e7ca-4c4e-8258-8f1c88153eee","Type":"ContainerStarted","Data":"5bf1f5b8906e2957f9d8aee6e4cd182466bf7b309f65028a40c3931299546f68"} Apr 21 15:50:42.342119 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:42.342094 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" event={"ID":"ac2b6778-f5e1-4a7d-b8b8-711d30992507","Type":"ContainerStarted","Data":"e6c4e0cfb53de4a2840e041522061f56d5acb05d8df5cd239515acb18d5cfaa5"} Apr 21 15:50:42.342119 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:42.342125 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" event={"ID":"ac2b6778-f5e1-4a7d-b8b8-711d30992507","Type":"ContainerStarted","Data":"a94f0b18a933a0297ef3f4811f4d6b30143c9e2d01912a3663904b639b826a30"} Apr 21 15:50:43.348592 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:43.348555 2569 generic.go:358] "Generic (PLEG): container finished" podID="0a3ed068-e7ca-4c4e-8258-8f1c88153eee" containerID="6e2387877f595513eceb9a2f482ea5dd0d6a18661c8326385cf8f2f65b45ae31" exitCode=0 Apr 21 15:50:43.349056 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:43.348649 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" 
event={"ID":"0a3ed068-e7ca-4c4e-8258-8f1c88153eee","Type":"ContainerDied","Data":"6e2387877f595513eceb9a2f482ea5dd0d6a18661c8326385cf8f2f65b45ae31"} Apr 21 15:50:44.357170 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:44.357137 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" event={"ID":"0a3ed068-e7ca-4c4e-8258-8f1c88153eee","Type":"ContainerStarted","Data":"bee37398beeafef94a62bd8a94662c373c74c4a7368d0f23edef35335f9bf7f7"} Apr 21 15:50:44.357170 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:44.357176 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" event={"ID":"0a3ed068-e7ca-4c4e-8258-8f1c88153eee","Type":"ContainerStarted","Data":"a68bc2f631a41132d873b06974a6280ddbc2bad9dd3f9d0bc78a6e4dcadafb44"} Apr 21 15:50:44.357647 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:44.357267 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" Apr 21 15:50:44.384457 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:44.384383 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" podStartSLOduration=3.384359281 podStartE2EDuration="3.384359281s" podCreationTimestamp="2026-04-21 15:50:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:50:44.379925794 +0000 UTC m=+919.951958736" watchObservedRunningTime="2026-04-21 15:50:44.384359281 +0000 UTC m=+919.956392278" Apr 21 15:50:46.365585 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:46.365544 2569 generic.go:358] "Generic (PLEG): container finished" podID="ac2b6778-f5e1-4a7d-b8b8-711d30992507" 
containerID="e6c4e0cfb53de4a2840e041522061f56d5acb05d8df5cd239515acb18d5cfaa5" exitCode=0 Apr 21 15:50:46.366035 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:46.365616 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" event={"ID":"ac2b6778-f5e1-4a7d-b8b8-711d30992507","Type":"ContainerDied","Data":"e6c4e0cfb53de4a2840e041522061f56d5acb05d8df5cd239515acb18d5cfaa5"} Apr 21 15:50:48.376576 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:48.376540 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" event={"ID":"ac2b6778-f5e1-4a7d-b8b8-711d30992507","Type":"ContainerStarted","Data":"30c3bae5c3b99e38410ac93561485da641a47cdbe377c8a96107ad1cf1e8a89f"} Apr 21 15:50:48.398449 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:48.398394 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" podStartSLOduration=6.295232061 podStartE2EDuration="7.39838027s" podCreationTimestamp="2026-04-21 15:50:41 +0000 UTC" firstStartedPulling="2026-04-21 15:50:46.366755135 +0000 UTC m=+921.938788059" lastFinishedPulling="2026-04-21 15:50:47.469903342 +0000 UTC m=+923.041936268" observedRunningTime="2026-04-21 15:50:48.396792099 +0000 UTC m=+923.968825042" watchObservedRunningTime="2026-04-21 15:50:48.39838027 +0000 UTC m=+923.970413227" Apr 21 15:50:51.592914 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:51.592859 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" Apr 21 15:50:51.593297 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:51.592938 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" Apr 21 15:50:51.605372 ip-10-0-128-232 
kubenswrapper[2569]: I0421 15:50:51.605333 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" Apr 21 15:50:51.821350 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:51.821312 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" Apr 21 15:50:51.821350 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:51.821356 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" Apr 21 15:50:51.823641 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:50:51.823616 2569 logging.go:55] [core] [Channel #41 SubChannel #42]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.46:9003", ServerName: "10.133.0.46:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.46:9003: connect: connection refused" Apr 21 15:50:51.823945 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:51.823926 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" Apr 21 15:50:52.395102 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:52.395072 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" Apr 21 15:50:52.405038 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:52.405009 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" Apr 21 15:50:52.821918 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:50:52.821860 2569 prober.go:120] "Probe failed" probeType="Liveness" 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" podUID="0a3ed068-e7ca-4c4e-8258-8f1c88153eee" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.46:9003\" within 1s: context deadline exceeded" Apr 21 15:51:01.822064 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:51:01.822029 2569 logging.go:55] [core] [Channel #43 SubChannel #44]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.46:9003", ServerName: "10.133.0.46:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.46:9003: connect: connection refused" Apr 21 15:51:02.821710 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:02.821662 2569 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" podUID="0a3ed068-e7ca-4c4e-8258-8f1c88153eee" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.46:9003\" within 1s: context deadline exceeded" Apr 21 15:51:13.399316 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:13.399285 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" Apr 21 15:51:14.488194 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:14.488161 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j"] Apr 21 15:51:14.488619 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:14.488473 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" podUID="0a3ed068-e7ca-4c4e-8258-8f1c88153eee" containerName="main" containerID="cri-o://a68bc2f631a41132d873b06974a6280ddbc2bad9dd3f9d0bc78a6e4dcadafb44" gracePeriod=30 Apr 21 15:51:14.488619 ip-10-0-128-232 
kubenswrapper[2569]: I0421 15:51:14.488558 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" podUID="0a3ed068-e7ca-4c4e-8258-8f1c88153eee" containerName="tokenizer" containerID="cri-o://bee37398beeafef94a62bd8a94662c373c74c4a7368d0f23edef35335f9bf7f7" gracePeriod=30 Apr 21 15:51:14.497796 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:14.497766 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc"] Apr 21 15:51:14.498119 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:14.498090 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" podUID="ac2b6778-f5e1-4a7d-b8b8-711d30992507" containerName="main" containerID="cri-o://30c3bae5c3b99e38410ac93561485da641a47cdbe377c8a96107ad1cf1e8a89f" gracePeriod=30 Apr 21 15:51:14.753258 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:14.753228 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" Apr 21 15:51:14.904344 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:14.904304 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac2b6778-f5e1-4a7d-b8b8-711d30992507-kserve-provision-location\") pod \"ac2b6778-f5e1-4a7d-b8b8-711d30992507\" (UID: \"ac2b6778-f5e1-4a7d-b8b8-711d30992507\") " Apr 21 15:51:14.904559 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:14.904364 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ac2b6778-f5e1-4a7d-b8b8-711d30992507-model-cache\") pod \"ac2b6778-f5e1-4a7d-b8b8-711d30992507\" (UID: \"ac2b6778-f5e1-4a7d-b8b8-711d30992507\") " Apr 21 15:51:14.904559 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:14.904402 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ac2b6778-f5e1-4a7d-b8b8-711d30992507-tls-certs\") pod \"ac2b6778-f5e1-4a7d-b8b8-711d30992507\" (UID: \"ac2b6778-f5e1-4a7d-b8b8-711d30992507\") " Apr 21 15:51:14.904559 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:14.904441 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ac2b6778-f5e1-4a7d-b8b8-711d30992507-home\") pod \"ac2b6778-f5e1-4a7d-b8b8-711d30992507\" (UID: \"ac2b6778-f5e1-4a7d-b8b8-711d30992507\") " Apr 21 15:51:14.904559 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:14.904461 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jr49n\" (UniqueName: \"kubernetes.io/projected/ac2b6778-f5e1-4a7d-b8b8-711d30992507-kube-api-access-jr49n\") pod \"ac2b6778-f5e1-4a7d-b8b8-711d30992507\" (UID: \"ac2b6778-f5e1-4a7d-b8b8-711d30992507\") " Apr 21 15:51:14.904559 
ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:14.904516 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ac2b6778-f5e1-4a7d-b8b8-711d30992507-dshm\") pod \"ac2b6778-f5e1-4a7d-b8b8-711d30992507\" (UID: \"ac2b6778-f5e1-4a7d-b8b8-711d30992507\") " Apr 21 15:51:14.904804 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:14.904690 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac2b6778-f5e1-4a7d-b8b8-711d30992507-model-cache" (OuterVolumeSpecName: "model-cache") pod "ac2b6778-f5e1-4a7d-b8b8-711d30992507" (UID: "ac2b6778-f5e1-4a7d-b8b8-711d30992507"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:51:14.904804 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:14.904786 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ac2b6778-f5e1-4a7d-b8b8-711d30992507-model-cache\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:51:14.904986 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:14.904958 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac2b6778-f5e1-4a7d-b8b8-711d30992507-home" (OuterVolumeSpecName: "home") pod "ac2b6778-f5e1-4a7d-b8b8-711d30992507" (UID: "ac2b6778-f5e1-4a7d-b8b8-711d30992507"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:51:14.906707 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:14.906681 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac2b6778-f5e1-4a7d-b8b8-711d30992507-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ac2b6778-f5e1-4a7d-b8b8-711d30992507" (UID: "ac2b6778-f5e1-4a7d-b8b8-711d30992507"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:51:14.907020 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:14.907001 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac2b6778-f5e1-4a7d-b8b8-711d30992507-dshm" (OuterVolumeSpecName: "dshm") pod "ac2b6778-f5e1-4a7d-b8b8-711d30992507" (UID: "ac2b6778-f5e1-4a7d-b8b8-711d30992507"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:51:14.907090 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:14.907068 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac2b6778-f5e1-4a7d-b8b8-711d30992507-kube-api-access-jr49n" (OuterVolumeSpecName: "kube-api-access-jr49n") pod "ac2b6778-f5e1-4a7d-b8b8-711d30992507" (UID: "ac2b6778-f5e1-4a7d-b8b8-711d30992507"). InnerVolumeSpecName "kube-api-access-jr49n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:51:14.958896 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:14.958828 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac2b6778-f5e1-4a7d-b8b8-711d30992507-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ac2b6778-f5e1-4a7d-b8b8-711d30992507" (UID: "ac2b6778-f5e1-4a7d-b8b8-711d30992507"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:51:15.005755 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:15.005674 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac2b6778-f5e1-4a7d-b8b8-711d30992507-kserve-provision-location\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:51:15.005755 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:15.005710 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ac2b6778-f5e1-4a7d-b8b8-711d30992507-tls-certs\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:51:15.005755 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:15.005719 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ac2b6778-f5e1-4a7d-b8b8-711d30992507-home\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:51:15.005755 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:15.005728 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jr49n\" (UniqueName: \"kubernetes.io/projected/ac2b6778-f5e1-4a7d-b8b8-711d30992507-kube-api-access-jr49n\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:51:15.005755 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:15.005740 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ac2b6778-f5e1-4a7d-b8b8-711d30992507-dshm\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:51:15.487040 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:15.487001 2569 generic.go:358] "Generic (PLEG): container finished" podID="0a3ed068-e7ca-4c4e-8258-8f1c88153eee" containerID="a68bc2f631a41132d873b06974a6280ddbc2bad9dd3f9d0bc78a6e4dcadafb44" exitCode=0 Apr 21 15:51:15.487215 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:15.487070 2569 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" event={"ID":"0a3ed068-e7ca-4c4e-8258-8f1c88153eee","Type":"ContainerDied","Data":"a68bc2f631a41132d873b06974a6280ddbc2bad9dd3f9d0bc78a6e4dcadafb44"} Apr 21 15:51:15.488425 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:15.488402 2569 generic.go:358] "Generic (PLEG): container finished" podID="ac2b6778-f5e1-4a7d-b8b8-711d30992507" containerID="30c3bae5c3b99e38410ac93561485da641a47cdbe377c8a96107ad1cf1e8a89f" exitCode=0 Apr 21 15:51:15.488802 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:15.488475 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" event={"ID":"ac2b6778-f5e1-4a7d-b8b8-711d30992507","Type":"ContainerDied","Data":"30c3bae5c3b99e38410ac93561485da641a47cdbe377c8a96107ad1cf1e8a89f"} Apr 21 15:51:15.488802 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:15.488527 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" event={"ID":"ac2b6778-f5e1-4a7d-b8b8-711d30992507","Type":"ContainerDied","Data":"a94f0b18a933a0297ef3f4811f4d6b30143c9e2d01912a3663904b639b826a30"} Apr 21 15:51:15.488802 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:15.488545 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc" Apr 21 15:51:15.488802 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:15.488550 2569 scope.go:117] "RemoveContainer" containerID="30c3bae5c3b99e38410ac93561485da641a47cdbe377c8a96107ad1cf1e8a89f" Apr 21 15:51:15.497596 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:15.497536 2569 scope.go:117] "RemoveContainer" containerID="e6c4e0cfb53de4a2840e041522061f56d5acb05d8df5cd239515acb18d5cfaa5" Apr 21 15:51:15.510452 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:15.510417 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc"] Apr 21 15:51:15.513066 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:15.513041 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-29gmc"] Apr 21 15:51:15.570472 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:15.570449 2569 scope.go:117] "RemoveContainer" containerID="30c3bae5c3b99e38410ac93561485da641a47cdbe377c8a96107ad1cf1e8a89f" Apr 21 15:51:15.570801 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:51:15.570780 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30c3bae5c3b99e38410ac93561485da641a47cdbe377c8a96107ad1cf1e8a89f\": container with ID starting with 30c3bae5c3b99e38410ac93561485da641a47cdbe377c8a96107ad1cf1e8a89f not found: ID does not exist" containerID="30c3bae5c3b99e38410ac93561485da641a47cdbe377c8a96107ad1cf1e8a89f" Apr 21 15:51:15.570866 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:15.570811 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30c3bae5c3b99e38410ac93561485da641a47cdbe377c8a96107ad1cf1e8a89f"} err="failed to get container status \"30c3bae5c3b99e38410ac93561485da641a47cdbe377c8a96107ad1cf1e8a89f\": rpc error: code = NotFound desc = could 
not find container \"30c3bae5c3b99e38410ac93561485da641a47cdbe377c8a96107ad1cf1e8a89f\": container with ID starting with 30c3bae5c3b99e38410ac93561485da641a47cdbe377c8a96107ad1cf1e8a89f not found: ID does not exist" Apr 21 15:51:15.570866 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:15.570831 2569 scope.go:117] "RemoveContainer" containerID="e6c4e0cfb53de4a2840e041522061f56d5acb05d8df5cd239515acb18d5cfaa5" Apr 21 15:51:15.571128 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:51:15.571106 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6c4e0cfb53de4a2840e041522061f56d5acb05d8df5cd239515acb18d5cfaa5\": container with ID starting with e6c4e0cfb53de4a2840e041522061f56d5acb05d8df5cd239515acb18d5cfaa5 not found: ID does not exist" containerID="e6c4e0cfb53de4a2840e041522061f56d5acb05d8df5cd239515acb18d5cfaa5" Apr 21 15:51:15.571178 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:15.571137 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6c4e0cfb53de4a2840e041522061f56d5acb05d8df5cd239515acb18d5cfaa5"} err="failed to get container status \"e6c4e0cfb53de4a2840e041522061f56d5acb05d8df5cd239515acb18d5cfaa5\": rpc error: code = NotFound desc = could not find container \"e6c4e0cfb53de4a2840e041522061f56d5acb05d8df5cd239515acb18d5cfaa5\": container with ID starting with e6c4e0cfb53de4a2840e041522061f56d5acb05d8df5cd239515acb18d5cfaa5 not found: ID does not exist" Apr 21 15:51:16.042555 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.042531 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" Apr 21 15:51:16.215370 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.215343 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-tokenizer-cache\") pod \"0a3ed068-e7ca-4c4e-8258-8f1c88153eee\" (UID: \"0a3ed068-e7ca-4c4e-8258-8f1c88153eee\") " Apr 21 15:51:16.215565 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.215400 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-tokenizer-tmp\") pod \"0a3ed068-e7ca-4c4e-8258-8f1c88153eee\" (UID: \"0a3ed068-e7ca-4c4e-8258-8f1c88153eee\") " Apr 21 15:51:16.215565 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.215449 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-tokenizer-uds\") pod \"0a3ed068-e7ca-4c4e-8258-8f1c88153eee\" (UID: \"0a3ed068-e7ca-4c4e-8258-8f1c88153eee\") " Apr 21 15:51:16.215565 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.215509 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-tls-certs\") pod \"0a3ed068-e7ca-4c4e-8258-8f1c88153eee\" (UID: \"0a3ed068-e7ca-4c4e-8258-8f1c88153eee\") " Apr 21 15:51:16.215565 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.215536 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-kserve-provision-location\") pod \"0a3ed068-e7ca-4c4e-8258-8f1c88153eee\" (UID: \"0a3ed068-e7ca-4c4e-8258-8f1c88153eee\") " Apr 21 
15:51:16.215565 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.215552 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pppb\" (UniqueName: \"kubernetes.io/projected/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-kube-api-access-2pppb\") pod \"0a3ed068-e7ca-4c4e-8258-8f1c88153eee\" (UID: \"0a3ed068-e7ca-4c4e-8258-8f1c88153eee\") " Apr 21 15:51:16.215838 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.215707 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "0a3ed068-e7ca-4c4e-8258-8f1c88153eee" (UID: "0a3ed068-e7ca-4c4e-8258-8f1c88153eee"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:51:16.215897 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.215831 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "0a3ed068-e7ca-4c4e-8258-8f1c88153eee" (UID: "0a3ed068-e7ca-4c4e-8258-8f1c88153eee"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:51:16.215897 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.215841 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "0a3ed068-e7ca-4c4e-8258-8f1c88153eee" (UID: "0a3ed068-e7ca-4c4e-8258-8f1c88153eee"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:51:16.216316 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.216290 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0a3ed068-e7ca-4c4e-8258-8f1c88153eee" (UID: "0a3ed068-e7ca-4c4e-8258-8f1c88153eee"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:51:16.217739 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.217715 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "0a3ed068-e7ca-4c4e-8258-8f1c88153eee" (UID: "0a3ed068-e7ca-4c4e-8258-8f1c88153eee"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:51:16.217811 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.217717 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-kube-api-access-2pppb" (OuterVolumeSpecName: "kube-api-access-2pppb") pod "0a3ed068-e7ca-4c4e-8258-8f1c88153eee" (UID: "0a3ed068-e7ca-4c4e-8258-8f1c88153eee"). InnerVolumeSpecName "kube-api-access-2pppb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:51:16.316468 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.316422 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-tokenizer-uds\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:51:16.316468 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.316502 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-tls-certs\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:51:16.316702 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.316514 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-kserve-provision-location\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:51:16.316702 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.316524 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2pppb\" (UniqueName: \"kubernetes.io/projected/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-kube-api-access-2pppb\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:51:16.316702 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.316534 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-tokenizer-cache\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:51:16.316702 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.316542 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0a3ed068-e7ca-4c4e-8258-8f1c88153eee-tokenizer-tmp\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:51:16.494958 ip-10-0-128-232 
kubenswrapper[2569]: I0421 15:51:16.494861 2569 generic.go:358] "Generic (PLEG): container finished" podID="0a3ed068-e7ca-4c4e-8258-8f1c88153eee" containerID="bee37398beeafef94a62bd8a94662c373c74c4a7368d0f23edef35335f9bf7f7" exitCode=0 Apr 21 15:51:16.494958 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.494915 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" event={"ID":"0a3ed068-e7ca-4c4e-8258-8f1c88153eee","Type":"ContainerDied","Data":"bee37398beeafef94a62bd8a94662c373c74c4a7368d0f23edef35335f9bf7f7"} Apr 21 15:51:16.494958 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.494935 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" Apr 21 15:51:16.495546 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.494963 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j" event={"ID":"0a3ed068-e7ca-4c4e-8258-8f1c88153eee","Type":"ContainerDied","Data":"5bf1f5b8906e2957f9d8aee6e4cd182466bf7b309f65028a40c3931299546f68"} Apr 21 15:51:16.495546 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.494983 2569 scope.go:117] "RemoveContainer" containerID="bee37398beeafef94a62bd8a94662c373c74c4a7368d0f23edef35335f9bf7f7" Apr 21 15:51:16.503501 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.503463 2569 scope.go:117] "RemoveContainer" containerID="a68bc2f631a41132d873b06974a6280ddbc2bad9dd3f9d0bc78a6e4dcadafb44" Apr 21 15:51:16.511098 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.511081 2569 scope.go:117] "RemoveContainer" containerID="6e2387877f595513eceb9a2f482ea5dd0d6a18661c8326385cf8f2f65b45ae31" Apr 21 15:51:16.519102 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.519082 2569 scope.go:117] "RemoveContainer" 
containerID="bee37398beeafef94a62bd8a94662c373c74c4a7368d0f23edef35335f9bf7f7" Apr 21 15:51:16.519375 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:51:16.519356 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bee37398beeafef94a62bd8a94662c373c74c4a7368d0f23edef35335f9bf7f7\": container with ID starting with bee37398beeafef94a62bd8a94662c373c74c4a7368d0f23edef35335f9bf7f7 not found: ID does not exist" containerID="bee37398beeafef94a62bd8a94662c373c74c4a7368d0f23edef35335f9bf7f7" Apr 21 15:51:16.519414 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.519383 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bee37398beeafef94a62bd8a94662c373c74c4a7368d0f23edef35335f9bf7f7"} err="failed to get container status \"bee37398beeafef94a62bd8a94662c373c74c4a7368d0f23edef35335f9bf7f7\": rpc error: code = NotFound desc = could not find container \"bee37398beeafef94a62bd8a94662c373c74c4a7368d0f23edef35335f9bf7f7\": container with ID starting with bee37398beeafef94a62bd8a94662c373c74c4a7368d0f23edef35335f9bf7f7 not found: ID does not exist" Apr 21 15:51:16.519414 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.519402 2569 scope.go:117] "RemoveContainer" containerID="a68bc2f631a41132d873b06974a6280ddbc2bad9dd3f9d0bc78a6e4dcadafb44" Apr 21 15:51:16.519660 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:51:16.519643 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a68bc2f631a41132d873b06974a6280ddbc2bad9dd3f9d0bc78a6e4dcadafb44\": container with ID starting with a68bc2f631a41132d873b06974a6280ddbc2bad9dd3f9d0bc78a6e4dcadafb44 not found: ID does not exist" containerID="a68bc2f631a41132d873b06974a6280ddbc2bad9dd3f9d0bc78a6e4dcadafb44" Apr 21 15:51:16.519706 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.519668 2569 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"a68bc2f631a41132d873b06974a6280ddbc2bad9dd3f9d0bc78a6e4dcadafb44"} err="failed to get container status \"a68bc2f631a41132d873b06974a6280ddbc2bad9dd3f9d0bc78a6e4dcadafb44\": rpc error: code = NotFound desc = could not find container \"a68bc2f631a41132d873b06974a6280ddbc2bad9dd3f9d0bc78a6e4dcadafb44\": container with ID starting with a68bc2f631a41132d873b06974a6280ddbc2bad9dd3f9d0bc78a6e4dcadafb44 not found: ID does not exist" Apr 21 15:51:16.519706 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.519683 2569 scope.go:117] "RemoveContainer" containerID="6e2387877f595513eceb9a2f482ea5dd0d6a18661c8326385cf8f2f65b45ae31" Apr 21 15:51:16.519880 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:51:16.519864 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e2387877f595513eceb9a2f482ea5dd0d6a18661c8326385cf8f2f65b45ae31\": container with ID starting with 6e2387877f595513eceb9a2f482ea5dd0d6a18661c8326385cf8f2f65b45ae31 not found: ID does not exist" containerID="6e2387877f595513eceb9a2f482ea5dd0d6a18661c8326385cf8f2f65b45ae31" Apr 21 15:51:16.519929 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.519882 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e2387877f595513eceb9a2f482ea5dd0d6a18661c8326385cf8f2f65b45ae31"} err="failed to get container status \"6e2387877f595513eceb9a2f482ea5dd0d6a18661c8326385cf8f2f65b45ae31\": rpc error: code = NotFound desc = could not find container \"6e2387877f595513eceb9a2f482ea5dd0d6a18661c8326385cf8f2f65b45ae31\": container with ID starting with 6e2387877f595513eceb9a2f482ea5dd0d6a18661c8326385cf8f2f65b45ae31 not found: ID does not exist" Apr 21 15:51:16.523507 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.523472 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j"] Apr 21 
15:51:16.527094 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:16.527075 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-686c5578fq47j"] Apr 21 15:51:17.055091 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.055055 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a3ed068-e7ca-4c4e-8258-8f1c88153eee" path="/var/lib/kubelet/pods/0a3ed068-e7ca-4c4e-8258-8f1c88153eee/volumes" Apr 21 15:51:17.055562 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.055547 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac2b6778-f5e1-4a7d-b8b8-711d30992507" path="/var/lib/kubelet/pods/ac2b6778-f5e1-4a7d-b8b8-711d30992507/volumes" Apr 21 15:51:17.676438 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.676401 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr"] Apr 21 15:51:17.676824 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.676749 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a3ed068-e7ca-4c4e-8258-8f1c88153eee" containerName="storage-initializer" Apr 21 15:51:17.676824 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.676761 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a3ed068-e7ca-4c4e-8258-8f1c88153eee" containerName="storage-initializer" Apr 21 15:51:17.676824 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.676769 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac2b6778-f5e1-4a7d-b8b8-711d30992507" containerName="main" Apr 21 15:51:17.676824 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.676774 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac2b6778-f5e1-4a7d-b8b8-711d30992507" containerName="main" Apr 21 15:51:17.676824 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.676786 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="ac2b6778-f5e1-4a7d-b8b8-711d30992507" containerName="storage-initializer" Apr 21 15:51:17.676824 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.676792 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac2b6778-f5e1-4a7d-b8b8-711d30992507" containerName="storage-initializer" Apr 21 15:51:17.676824 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.676803 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a3ed068-e7ca-4c4e-8258-8f1c88153eee" containerName="tokenizer" Apr 21 15:51:17.676824 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.676808 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a3ed068-e7ca-4c4e-8258-8f1c88153eee" containerName="tokenizer" Apr 21 15:51:17.676824 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.676816 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a3ed068-e7ca-4c4e-8258-8f1c88153eee" containerName="main" Apr 21 15:51:17.676824 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.676820 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a3ed068-e7ca-4c4e-8258-8f1c88153eee" containerName="main" Apr 21 15:51:17.677138 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.676876 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a3ed068-e7ca-4c4e-8258-8f1c88153eee" containerName="main" Apr 21 15:51:17.677138 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.676885 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a3ed068-e7ca-4c4e-8258-8f1c88153eee" containerName="tokenizer" Apr 21 15:51:17.677138 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.676891 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="ac2b6778-f5e1-4a7d-b8b8-711d30992507" containerName="main" Apr 21 15:51:17.679815 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.679799 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr" Apr 21 15:51:17.682459 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.682438 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 21 15:51:17.682615 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.682474 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 21 15:51:17.682615 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.682580 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-lp6xp\"" Apr 21 15:51:17.683238 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.683224 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"conv-test-v1a2-to-v1a1-kserve-self-signed-certs\"" Apr 21 15:51:17.690348 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.690327 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr"] Apr 21 15:51:17.831859 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.831818 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e34df173-59f1-44e3-89c7-cc8dac69ea0d-model-cache\") pod \"conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr\" (UID: \"e34df173-59f1-44e3-89c7-cc8dac69ea0d\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr" Apr 21 15:51:17.832038 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.831863 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e34df173-59f1-44e3-89c7-cc8dac69ea0d-tls-certs\") pod \"conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr\" (UID: 
\"e34df173-59f1-44e3-89c7-cc8dac69ea0d\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr" Apr 21 15:51:17.832038 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.831971 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e34df173-59f1-44e3-89c7-cc8dac69ea0d-dshm\") pod \"conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr\" (UID: \"e34df173-59f1-44e3-89c7-cc8dac69ea0d\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr" Apr 21 15:51:17.832038 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.831994 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e34df173-59f1-44e3-89c7-cc8dac69ea0d-kserve-provision-location\") pod \"conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr\" (UID: \"e34df173-59f1-44e3-89c7-cc8dac69ea0d\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr" Apr 21 15:51:17.832038 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.832014 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e34df173-59f1-44e3-89c7-cc8dac69ea0d-home\") pod \"conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr\" (UID: \"e34df173-59f1-44e3-89c7-cc8dac69ea0d\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr" Apr 21 15:51:17.832231 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.832111 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdqlf\" (UniqueName: \"kubernetes.io/projected/e34df173-59f1-44e3-89c7-cc8dac69ea0d-kube-api-access-gdqlf\") pod \"conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr\" (UID: \"e34df173-59f1-44e3-89c7-cc8dac69ea0d\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr" 
Apr 21 15:51:17.933368 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.933271 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e34df173-59f1-44e3-89c7-cc8dac69ea0d-dshm\") pod \"conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr\" (UID: \"e34df173-59f1-44e3-89c7-cc8dac69ea0d\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr" Apr 21 15:51:17.933368 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.933319 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e34df173-59f1-44e3-89c7-cc8dac69ea0d-kserve-provision-location\") pod \"conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr\" (UID: \"e34df173-59f1-44e3-89c7-cc8dac69ea0d\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr" Apr 21 15:51:17.933368 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.933347 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e34df173-59f1-44e3-89c7-cc8dac69ea0d-home\") pod \"conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr\" (UID: \"e34df173-59f1-44e3-89c7-cc8dac69ea0d\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr" Apr 21 15:51:17.933635 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.933386 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gdqlf\" (UniqueName: \"kubernetes.io/projected/e34df173-59f1-44e3-89c7-cc8dac69ea0d-kube-api-access-gdqlf\") pod \"conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr\" (UID: \"e34df173-59f1-44e3-89c7-cc8dac69ea0d\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr" Apr 21 15:51:17.933635 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.933458 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/e34df173-59f1-44e3-89c7-cc8dac69ea0d-model-cache\") pod \"conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr\" (UID: \"e34df173-59f1-44e3-89c7-cc8dac69ea0d\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr" Apr 21 15:51:17.933635 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.933516 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e34df173-59f1-44e3-89c7-cc8dac69ea0d-tls-certs\") pod \"conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr\" (UID: \"e34df173-59f1-44e3-89c7-cc8dac69ea0d\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr" Apr 21 15:51:17.933782 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.933757 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e34df173-59f1-44e3-89c7-cc8dac69ea0d-kserve-provision-location\") pod \"conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr\" (UID: \"e34df173-59f1-44e3-89c7-cc8dac69ea0d\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr" Apr 21 15:51:17.933836 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.933786 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e34df173-59f1-44e3-89c7-cc8dac69ea0d-home\") pod \"conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr\" (UID: \"e34df173-59f1-44e3-89c7-cc8dac69ea0d\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr" Apr 21 15:51:17.933936 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.933914 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e34df173-59f1-44e3-89c7-cc8dac69ea0d-model-cache\") pod \"conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr\" (UID: \"e34df173-59f1-44e3-89c7-cc8dac69ea0d\") " 
pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr" Apr 21 15:51:17.935686 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.935657 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e34df173-59f1-44e3-89c7-cc8dac69ea0d-dshm\") pod \"conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr\" (UID: \"e34df173-59f1-44e3-89c7-cc8dac69ea0d\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr" Apr 21 15:51:17.935984 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.935968 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e34df173-59f1-44e3-89c7-cc8dac69ea0d-tls-certs\") pod \"conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr\" (UID: \"e34df173-59f1-44e3-89c7-cc8dac69ea0d\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr" Apr 21 15:51:17.941194 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.941174 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdqlf\" (UniqueName: \"kubernetes.io/projected/e34df173-59f1-44e3-89c7-cc8dac69ea0d-kube-api-access-gdqlf\") pod \"conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr\" (UID: \"e34df173-59f1-44e3-89c7-cc8dac69ea0d\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr" Apr 21 15:51:17.990193 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:17.990150 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr" Apr 21 15:51:18.127020 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:18.126992 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr"] Apr 21 15:51:18.128939 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:51:18.128905 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode34df173_59f1_44e3_89c7_cc8dac69ea0d.slice/crio-601e2e31df68424c949d8e38f82acbc5b2ac394cea17e39e71b2aa451346fb63 WatchSource:0}: Error finding container 601e2e31df68424c949d8e38f82acbc5b2ac394cea17e39e71b2aa451346fb63: Status 404 returned error can't find the container with id 601e2e31df68424c949d8e38f82acbc5b2ac394cea17e39e71b2aa451346fb63 Apr 21 15:51:18.131212 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:18.131195 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:51:18.506412 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:18.506325 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr" event={"ID":"e34df173-59f1-44e3-89c7-cc8dac69ea0d","Type":"ContainerStarted","Data":"3e577b28235b402addf99ce161b0032be86306f30c8ad6f8e49390050aae8975"} Apr 21 15:51:18.506412 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:18.506363 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr" event={"ID":"e34df173-59f1-44e3-89c7-cc8dac69ea0d","Type":"ContainerStarted","Data":"601e2e31df68424c949d8e38f82acbc5b2ac394cea17e39e71b2aa451346fb63"} Apr 21 15:51:22.534993 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:22.534954 2569 generic.go:358] "Generic (PLEG): container finished" podID="e34df173-59f1-44e3-89c7-cc8dac69ea0d" 
containerID="3e577b28235b402addf99ce161b0032be86306f30c8ad6f8e49390050aae8975" exitCode=0 Apr 21 15:51:22.535425 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:22.535023 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr" event={"ID":"e34df173-59f1-44e3-89c7-cc8dac69ea0d","Type":"ContainerDied","Data":"3e577b28235b402addf99ce161b0032be86306f30c8ad6f8e49390050aae8975"} Apr 21 15:51:26.298517 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:26.298462 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29"] Apr 21 15:51:26.302430 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:26.302404 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" Apr 21 15:51:26.306031 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:26.305997 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-qmjlw\"" Apr 21 15:51:26.306726 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:26.306702 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 21 15:51:26.321656 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:26.321624 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29"] Apr 21 15:51:26.406247 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:26.406204 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29\" (UID: \"b6e1b692-c388-4c12-a6b5-a2f6a21a7821\") 
" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" Apr 21 15:51:26.406427 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:26.406256 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29\" (UID: \"b6e1b692-c388-4c12-a6b5-a2f6a21a7821\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" Apr 21 15:51:26.406427 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:26.406283 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29\" (UID: \"b6e1b692-c388-4c12-a6b5-a2f6a21a7821\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" Apr 21 15:51:26.406427 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:26.406335 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29\" (UID: \"b6e1b692-c388-4c12-a6b5-a2f6a21a7821\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" Apr 21 15:51:26.406427 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:26.406400 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twhs8\" (UniqueName: \"kubernetes.io/projected/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-kube-api-access-twhs8\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29\" (UID: 
\"b6e1b692-c388-4c12-a6b5-a2f6a21a7821\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" Apr 21 15:51:26.406587 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:26.406450 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29\" (UID: \"b6e1b692-c388-4c12-a6b5-a2f6a21a7821\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" Apr 21 15:51:26.507689 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:26.507642 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29\" (UID: \"b6e1b692-c388-4c12-a6b5-a2f6a21a7821\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" Apr 21 15:51:26.507886 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:26.507704 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29\" (UID: \"b6e1b692-c388-4c12-a6b5-a2f6a21a7821\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" Apr 21 15:51:26.507886 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:26.507730 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29\" (UID: \"b6e1b692-c388-4c12-a6b5-a2f6a21a7821\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" Apr 21 15:51:26.507886 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:26.507747 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29\" (UID: \"b6e1b692-c388-4c12-a6b5-a2f6a21a7821\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" Apr 21 15:51:26.507886 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:26.507768 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29\" (UID: \"b6e1b692-c388-4c12-a6b5-a2f6a21a7821\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" Apr 21 15:51:26.507886 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:26.507809 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twhs8\" (UniqueName: \"kubernetes.io/projected/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-kube-api-access-twhs8\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29\" (UID: \"b6e1b692-c388-4c12-a6b5-a2f6a21a7821\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" Apr 21 15:51:26.508146 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:26.508123 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29\" (UID: \"b6e1b692-c388-4c12-a6b5-a2f6a21a7821\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" Apr 21 
15:51:26.508146 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:26.508135 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29\" (UID: \"b6e1b692-c388-4c12-a6b5-a2f6a21a7821\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" Apr 21 15:51:26.508260 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:26.508170 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29\" (UID: \"b6e1b692-c388-4c12-a6b5-a2f6a21a7821\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" Apr 21 15:51:26.508311 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:26.508256 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29\" (UID: \"b6e1b692-c388-4c12-a6b5-a2f6a21a7821\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" Apr 21 15:51:26.510246 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:26.510224 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29\" (UID: \"b6e1b692-c388-4c12-a6b5-a2f6a21a7821\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" Apr 21 15:51:26.516389 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:26.516356 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-twhs8\" (UniqueName: \"kubernetes.io/projected/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-kube-api-access-twhs8\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29\" (UID: \"b6e1b692-c388-4c12-a6b5-a2f6a21a7821\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" Apr 21 15:51:26.612738 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:26.612648 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" Apr 21 15:51:26.757308 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:26.757266 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29"] Apr 21 15:51:26.760233 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:51:26.760199 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6e1b692_c388_4c12_a6b5_a2f6a21a7821.slice/crio-eab4a6197c59a68b435ab0f38ec0af82ccf89f0cfac41231b534a7de2704c091 WatchSource:0}: Error finding container eab4a6197c59a68b435ab0f38ec0af82ccf89f0cfac41231b534a7de2704c091: Status 404 returned error can't find the container with id eab4a6197c59a68b435ab0f38ec0af82ccf89f0cfac41231b534a7de2704c091 Apr 21 15:51:27.556652 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:27.556613 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" event={"ID":"b6e1b692-c388-4c12-a6b5-a2f6a21a7821","Type":"ContainerStarted","Data":"029668c9e96063cdc0ed648403c87cffe2d84adc160d9c9a236d2e45b5d71235"} Apr 21 15:51:27.556652 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:27.556649 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" 
event={"ID":"b6e1b692-c388-4c12-a6b5-a2f6a21a7821","Type":"ContainerStarted","Data":"eab4a6197c59a68b435ab0f38ec0af82ccf89f0cfac41231b534a7de2704c091"} Apr 21 15:51:27.856825 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:27.856734 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr"] Apr 21 15:51:28.561700 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:28.561664 2569 generic.go:358] "Generic (PLEG): container finished" podID="b6e1b692-c388-4c12-a6b5-a2f6a21a7821" containerID="029668c9e96063cdc0ed648403c87cffe2d84adc160d9c9a236d2e45b5d71235" exitCode=0 Apr 21 15:51:28.562096 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:28.561749 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" event={"ID":"b6e1b692-c388-4c12-a6b5-a2f6a21a7821","Type":"ContainerDied","Data":"029668c9e96063cdc0ed648403c87cffe2d84adc160d9c9a236d2e45b5d71235"} Apr 21 15:51:29.568872 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:29.568836 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" event={"ID":"b6e1b692-c388-4c12-a6b5-a2f6a21a7821","Type":"ContainerStarted","Data":"947fa808dcce57afbaa36b8f24a0e29236dbb62a7d5cd8df774dc8478f830918"} Apr 21 15:51:29.568872 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:29.568874 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" event={"ID":"b6e1b692-c388-4c12-a6b5-a2f6a21a7821","Type":"ContainerStarted","Data":"c8555d9c7d6e66d0f9bf07761f76a88c1b15de8f8f6b93286f014a9c634e360b"} Apr 21 15:51:29.569369 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:29.568991 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" 
Apr 21 15:51:29.591696 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:29.591640 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" podStartSLOduration=3.591623648 podStartE2EDuration="3.591623648s" podCreationTimestamp="2026-04-21 15:51:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:51:29.589092181 +0000 UTC m=+965.161125124" watchObservedRunningTime="2026-04-21 15:51:29.591623648 +0000 UTC m=+965.163656593" Apr 21 15:51:36.613264 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:36.613217 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" Apr 21 15:51:36.613264 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:36.613265 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" Apr 21 15:51:36.614390 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:36.614324 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" podUID="b6e1b692-c388-4c12-a6b5-a2f6a21a7821" containerName="tokenizer" probeResult="failure" output="Get \"http://10.133.0.48:8082/healthz\": dial tcp 10.133.0.48:8082: connect: connection refused" Apr 21 15:51:46.615329 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:46.615255 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" Apr 21 15:51:46.616565 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:51:46.616543 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" Apr 21 15:52:06.648415 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:06.648380 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" Apr 21 15:52:10.742741 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:10.742698 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr" event={"ID":"e34df173-59f1-44e3-89c7-cc8dac69ea0d","Type":"ContainerStarted","Data":"9a7e27c30dc42e718d245333c52b62f5f96230059ec1005bd6653804f5ef1810"} Apr 21 15:52:10.743193 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:10.742791 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr" podUID="e34df173-59f1-44e3-89c7-cc8dac69ea0d" containerName="main" containerID="cri-o://9a7e27c30dc42e718d245333c52b62f5f96230059ec1005bd6653804f5ef1810" gracePeriod=30 Apr 21 15:52:10.767238 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:10.767176 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr" podStartSLOduration=6.04294602 podStartE2EDuration="53.767160015s" podCreationTimestamp="2026-04-21 15:51:17 +0000 UTC" firstStartedPulling="2026-04-21 15:51:22.536124336 +0000 UTC m=+958.108157257" lastFinishedPulling="2026-04-21 15:52:10.260338327 +0000 UTC m=+1005.832371252" observedRunningTime="2026-04-21 15:52:10.764169233 +0000 UTC m=+1006.336202176" watchObservedRunningTime="2026-04-21 15:52:10.767160015 +0000 UTC m=+1006.339192959" Apr 21 15:52:17.991219 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:17.991185 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr" Apr 21 
15:52:41.499582 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:41.499557 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr_e34df173-59f1-44e3-89c7-cc8dac69ea0d/main/0.log" Apr 21 15:52:41.499925 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:41.499896 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr" Apr 21 15:52:41.527668 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:41.527632 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e34df173-59f1-44e3-89c7-cc8dac69ea0d-home\") pod \"e34df173-59f1-44e3-89c7-cc8dac69ea0d\" (UID: \"e34df173-59f1-44e3-89c7-cc8dac69ea0d\") " Apr 21 15:52:41.527839 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:41.527684 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e34df173-59f1-44e3-89c7-cc8dac69ea0d-model-cache\") pod \"e34df173-59f1-44e3-89c7-cc8dac69ea0d\" (UID: \"e34df173-59f1-44e3-89c7-cc8dac69ea0d\") " Apr 21 15:52:41.527839 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:41.527737 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdqlf\" (UniqueName: \"kubernetes.io/projected/e34df173-59f1-44e3-89c7-cc8dac69ea0d-kube-api-access-gdqlf\") pod \"e34df173-59f1-44e3-89c7-cc8dac69ea0d\" (UID: \"e34df173-59f1-44e3-89c7-cc8dac69ea0d\") " Apr 21 15:52:41.527839 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:41.527830 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e34df173-59f1-44e3-89c7-cc8dac69ea0d-tls-certs\") pod \"e34df173-59f1-44e3-89c7-cc8dac69ea0d\" (UID: \"e34df173-59f1-44e3-89c7-cc8dac69ea0d\") " Apr 21 15:52:41.528012 ip-10-0-128-232 
kubenswrapper[2569]: I0421 15:52:41.527856 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e34df173-59f1-44e3-89c7-cc8dac69ea0d-kserve-provision-location\") pod \"e34df173-59f1-44e3-89c7-cc8dac69ea0d\" (UID: \"e34df173-59f1-44e3-89c7-cc8dac69ea0d\") " Apr 21 15:52:41.528012 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:41.527890 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e34df173-59f1-44e3-89c7-cc8dac69ea0d-dshm\") pod \"e34df173-59f1-44e3-89c7-cc8dac69ea0d\" (UID: \"e34df173-59f1-44e3-89c7-cc8dac69ea0d\") " Apr 21 15:52:41.528012 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:41.527924 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e34df173-59f1-44e3-89c7-cc8dac69ea0d-model-cache" (OuterVolumeSpecName: "model-cache") pod "e34df173-59f1-44e3-89c7-cc8dac69ea0d" (UID: "e34df173-59f1-44e3-89c7-cc8dac69ea0d"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:52:41.528012 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:41.527939 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e34df173-59f1-44e3-89c7-cc8dac69ea0d-home" (OuterVolumeSpecName: "home") pod "e34df173-59f1-44e3-89c7-cc8dac69ea0d" (UID: "e34df173-59f1-44e3-89c7-cc8dac69ea0d"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:52:41.528228 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:41.528122 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e34df173-59f1-44e3-89c7-cc8dac69ea0d-home\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:52:41.528228 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:41.528140 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e34df173-59f1-44e3-89c7-cc8dac69ea0d-model-cache\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:52:41.530047 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:41.530023 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e34df173-59f1-44e3-89c7-cc8dac69ea0d-dshm" (OuterVolumeSpecName: "dshm") pod "e34df173-59f1-44e3-89c7-cc8dac69ea0d" (UID: "e34df173-59f1-44e3-89c7-cc8dac69ea0d"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:52:41.530145 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:41.530055 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e34df173-59f1-44e3-89c7-cc8dac69ea0d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e34df173-59f1-44e3-89c7-cc8dac69ea0d" (UID: "e34df173-59f1-44e3-89c7-cc8dac69ea0d"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:52:41.530336 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:41.530317 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e34df173-59f1-44e3-89c7-cc8dac69ea0d-kube-api-access-gdqlf" (OuterVolumeSpecName: "kube-api-access-gdqlf") pod "e34df173-59f1-44e3-89c7-cc8dac69ea0d" (UID: "e34df173-59f1-44e3-89c7-cc8dac69ea0d"). InnerVolumeSpecName "kube-api-access-gdqlf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:52:41.584046 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:41.583985 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e34df173-59f1-44e3-89c7-cc8dac69ea0d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e34df173-59f1-44e3-89c7-cc8dac69ea0d" (UID: "e34df173-59f1-44e3-89c7-cc8dac69ea0d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:52:41.629300 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:41.629218 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gdqlf\" (UniqueName: \"kubernetes.io/projected/e34df173-59f1-44e3-89c7-cc8dac69ea0d-kube-api-access-gdqlf\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:52:41.629300 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:41.629249 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e34df173-59f1-44e3-89c7-cc8dac69ea0d-tls-certs\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:52:41.629300 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:41.629261 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e34df173-59f1-44e3-89c7-cc8dac69ea0d-kserve-provision-location\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:52:41.629300 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:41.629270 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e34df173-59f1-44e3-89c7-cc8dac69ea0d-dshm\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:52:41.872373 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:41.872344 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr_e34df173-59f1-44e3-89c7-cc8dac69ea0d/main/0.log" Apr 21 15:52:41.872713 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:41.872688 2569 generic.go:358] "Generic (PLEG): container finished" podID="e34df173-59f1-44e3-89c7-cc8dac69ea0d" containerID="9a7e27c30dc42e718d245333c52b62f5f96230059ec1005bd6653804f5ef1810" exitCode=137 Apr 21 15:52:41.872797 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:41.872776 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr" event={"ID":"e34df173-59f1-44e3-89c7-cc8dac69ea0d","Type":"ContainerDied","Data":"9a7e27c30dc42e718d245333c52b62f5f96230059ec1005bd6653804f5ef1810"} Apr 21 15:52:41.872833 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:41.872793 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr" Apr 21 15:52:41.872833 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:41.872819 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr" event={"ID":"e34df173-59f1-44e3-89c7-cc8dac69ea0d","Type":"ContainerDied","Data":"601e2e31df68424c949d8e38f82acbc5b2ac394cea17e39e71b2aa451346fb63"} Apr 21 15:52:41.872898 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:41.872836 2569 scope.go:117] "RemoveContainer" containerID="9a7e27c30dc42e718d245333c52b62f5f96230059ec1005bd6653804f5ef1810" Apr 21 15:52:41.881763 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:41.881743 2569 scope.go:117] "RemoveContainer" containerID="3e577b28235b402addf99ce161b0032be86306f30c8ad6f8e49390050aae8975" Apr 21 15:52:41.901276 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:41.901242 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr"] Apr 21 15:52:41.903507 
ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:41.903464 2569 scope.go:117] "RemoveContainer" containerID="9a7e27c30dc42e718d245333c52b62f5f96230059ec1005bd6653804f5ef1810"
Apr 21 15:52:41.903830 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:52:41.903811 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a7e27c30dc42e718d245333c52b62f5f96230059ec1005bd6653804f5ef1810\": container with ID starting with 9a7e27c30dc42e718d245333c52b62f5f96230059ec1005bd6653804f5ef1810 not found: ID does not exist" containerID="9a7e27c30dc42e718d245333c52b62f5f96230059ec1005bd6653804f5ef1810"
Apr 21 15:52:41.903916 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:41.903839 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a7e27c30dc42e718d245333c52b62f5f96230059ec1005bd6653804f5ef1810"} err="failed to get container status \"9a7e27c30dc42e718d245333c52b62f5f96230059ec1005bd6653804f5ef1810\": rpc error: code = NotFound desc = could not find container \"9a7e27c30dc42e718d245333c52b62f5f96230059ec1005bd6653804f5ef1810\": container with ID starting with 9a7e27c30dc42e718d245333c52b62f5f96230059ec1005bd6653804f5ef1810 not found: ID does not exist"
Apr 21 15:52:41.903916 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:41.903859 2569 scope.go:117] "RemoveContainer" containerID="3e577b28235b402addf99ce161b0032be86306f30c8ad6f8e49390050aae8975"
Apr 21 15:52:41.904115 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:52:41.904099 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e577b28235b402addf99ce161b0032be86306f30c8ad6f8e49390050aae8975\": container with ID starting with 3e577b28235b402addf99ce161b0032be86306f30c8ad6f8e49390050aae8975 not found: ID does not exist" containerID="3e577b28235b402addf99ce161b0032be86306f30c8ad6f8e49390050aae8975"
Apr 21 15:52:41.904170 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:41.904118 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e577b28235b402addf99ce161b0032be86306f30c8ad6f8e49390050aae8975"} err="failed to get container status \"3e577b28235b402addf99ce161b0032be86306f30c8ad6f8e49390050aae8975\": rpc error: code = NotFound desc = could not find container \"3e577b28235b402addf99ce161b0032be86306f30c8ad6f8e49390050aae8975\": container with ID starting with 3e577b28235b402addf99ce161b0032be86306f30c8ad6f8e49390050aae8975 not found: ID does not exist"
Apr 21 15:52:41.905859 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:41.905840 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-5f895444bb-sf6tr"]
Apr 21 15:52:43.054537 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:52:43.054508 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e34df173-59f1-44e3-89c7-cc8dac69ea0d" path="/var/lib/kubelet/pods/e34df173-59f1-44e3-89c7-cc8dac69ea0d/volumes"
Apr 21 15:54:17.747499 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:17.747455 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29"]
Apr 21 15:54:17.747917 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:17.747773 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" podUID="b6e1b692-c388-4c12-a6b5-a2f6a21a7821" containerName="main" containerID="cri-o://c8555d9c7d6e66d0f9bf07761f76a88c1b15de8f8f6b93286f014a9c634e360b" gracePeriod=30
Apr 21 15:54:17.747917 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:17.747824 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" podUID="b6e1b692-c388-4c12-a6b5-a2f6a21a7821" containerName="tokenizer" containerID="cri-o://947fa808dcce57afbaa36b8f24a0e29236dbb62a7d5cd8df774dc8478f830918" gracePeriod=30
Apr 21 15:54:17.798357 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:54:17.798321 2569 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6e1b692_c388_4c12_a6b5_a2f6a21a7821.slice/crio-conmon-c8555d9c7d6e66d0f9bf07761f76a88c1b15de8f8f6b93286f014a9c634e360b.scope\": RecentStats: unable to find data in memory cache]"
Apr 21 15:54:18.227423 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:18.227388 2569 generic.go:358] "Generic (PLEG): container finished" podID="b6e1b692-c388-4c12-a6b5-a2f6a21a7821" containerID="c8555d9c7d6e66d0f9bf07761f76a88c1b15de8f8f6b93286f014a9c634e360b" exitCode=0
Apr 21 15:54:18.227609 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:18.227443 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" event={"ID":"b6e1b692-c388-4c12-a6b5-a2f6a21a7821","Type":"ContainerDied","Data":"c8555d9c7d6e66d0f9bf07761f76a88c1b15de8f8f6b93286f014a9c634e360b"}
Apr 21 15:54:18.998581 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:18.998557 2569 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29"
Apr 21 15:54:19.183715 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.183680 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-tls-certs\") pod \"b6e1b692-c388-4c12-a6b5-a2f6a21a7821\" (UID: \"b6e1b692-c388-4c12-a6b5-a2f6a21a7821\") "
Apr 21 15:54:19.183901 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.183750 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-tokenizer-cache\") pod \"b6e1b692-c388-4c12-a6b5-a2f6a21a7821\" (UID: \"b6e1b692-c388-4c12-a6b5-a2f6a21a7821\") "
Apr 21 15:54:19.183901 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.183776 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-tokenizer-tmp\") pod \"b6e1b692-c388-4c12-a6b5-a2f6a21a7821\" (UID: \"b6e1b692-c388-4c12-a6b5-a2f6a21a7821\") "
Apr 21 15:54:19.183901 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.183804 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twhs8\" (UniqueName: \"kubernetes.io/projected/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-kube-api-access-twhs8\") pod \"b6e1b692-c388-4c12-a6b5-a2f6a21a7821\" (UID: \"b6e1b692-c388-4c12-a6b5-a2f6a21a7821\") "
Apr 21 15:54:19.183901 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.183857 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-kserve-provision-location\") pod \"b6e1b692-c388-4c12-a6b5-a2f6a21a7821\" (UID: \"b6e1b692-c388-4c12-a6b5-a2f6a21a7821\") "
Apr 21 15:54:19.184168 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.183901 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-tokenizer-uds\") pod \"b6e1b692-c388-4c12-a6b5-a2f6a21a7821\" (UID: \"b6e1b692-c388-4c12-a6b5-a2f6a21a7821\") "
Apr 21 15:54:19.184168 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.184015 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "b6e1b692-c388-4c12-a6b5-a2f6a21a7821" (UID: "b6e1b692-c388-4c12-a6b5-a2f6a21a7821"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:54:19.184262 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.184141 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "b6e1b692-c388-4c12-a6b5-a2f6a21a7821" (UID: "b6e1b692-c388-4c12-a6b5-a2f6a21a7821"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:54:19.184262 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.184194 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "b6e1b692-c388-4c12-a6b5-a2f6a21a7821" (UID: "b6e1b692-c388-4c12-a6b5-a2f6a21a7821"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:54:19.184262 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.184244 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-tokenizer-cache\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\""
Apr 21 15:54:19.184419 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.184265 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-tokenizer-tmp\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\""
Apr 21 15:54:19.184667 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.184642 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b6e1b692-c388-4c12-a6b5-a2f6a21a7821" (UID: "b6e1b692-c388-4c12-a6b5-a2f6a21a7821"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:54:19.185909 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.185886 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b6e1b692-c388-4c12-a6b5-a2f6a21a7821" (UID: "b6e1b692-c388-4c12-a6b5-a2f6a21a7821"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:54:19.186004 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.185984 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-kube-api-access-twhs8" (OuterVolumeSpecName: "kube-api-access-twhs8") pod "b6e1b692-c388-4c12-a6b5-a2f6a21a7821" (UID: "b6e1b692-c388-4c12-a6b5-a2f6a21a7821").
InnerVolumeSpecName "kube-api-access-twhs8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:54:19.233104 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.233066 2569 generic.go:358] "Generic (PLEG): container finished" podID="b6e1b692-c388-4c12-a6b5-a2f6a21a7821" containerID="947fa808dcce57afbaa36b8f24a0e29236dbb62a7d5cd8df774dc8478f830918" exitCode=0
Apr 21 15:54:19.233271 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.233149 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29"
Apr 21 15:54:19.233271 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.233152 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" event={"ID":"b6e1b692-c388-4c12-a6b5-a2f6a21a7821","Type":"ContainerDied","Data":"947fa808dcce57afbaa36b8f24a0e29236dbb62a7d5cd8df774dc8478f830918"}
Apr 21 15:54:19.233271 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.233191 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29" event={"ID":"b6e1b692-c388-4c12-a6b5-a2f6a21a7821","Type":"ContainerDied","Data":"eab4a6197c59a68b435ab0f38ec0af82ccf89f0cfac41231b534a7de2704c091"}
Apr 21 15:54:19.233271 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.233207 2569 scope.go:117] "RemoveContainer" containerID="947fa808dcce57afbaa36b8f24a0e29236dbb62a7d5cd8df774dc8478f830918"
Apr 21 15:54:19.242032 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.242016 2569 scope.go:117] "RemoveContainer" containerID="c8555d9c7d6e66d0f9bf07761f76a88c1b15de8f8f6b93286f014a9c634e360b"
Apr 21 15:54:19.249723 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.249702 2569 scope.go:117] "RemoveContainer" containerID="029668c9e96063cdc0ed648403c87cffe2d84adc160d9c9a236d2e45b5d71235"
Apr 21 15:54:19.255584 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.255563 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29"]
Apr 21 15:54:19.257457 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.257429 2569 scope.go:117] "RemoveContainer" containerID="947fa808dcce57afbaa36b8f24a0e29236dbb62a7d5cd8df774dc8478f830918"
Apr 21 15:54:19.257835 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:54:19.257811 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"947fa808dcce57afbaa36b8f24a0e29236dbb62a7d5cd8df774dc8478f830918\": container with ID starting with 947fa808dcce57afbaa36b8f24a0e29236dbb62a7d5cd8df774dc8478f830918 not found: ID does not exist" containerID="947fa808dcce57afbaa36b8f24a0e29236dbb62a7d5cd8df774dc8478f830918"
Apr 21 15:54:19.257912 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.257845 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"947fa808dcce57afbaa36b8f24a0e29236dbb62a7d5cd8df774dc8478f830918"} err="failed to get container status \"947fa808dcce57afbaa36b8f24a0e29236dbb62a7d5cd8df774dc8478f830918\": rpc error: code = NotFound desc = could not find container \"947fa808dcce57afbaa36b8f24a0e29236dbb62a7d5cd8df774dc8478f830918\": container with ID starting with 947fa808dcce57afbaa36b8f24a0e29236dbb62a7d5cd8df774dc8478f830918 not found: ID does not exist"
Apr 21 15:54:19.257912 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.257864 2569 scope.go:117] "RemoveContainer" containerID="c8555d9c7d6e66d0f9bf07761f76a88c1b15de8f8f6b93286f014a9c634e360b"
Apr 21 15:54:19.258130 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:54:19.258107 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8555d9c7d6e66d0f9bf07761f76a88c1b15de8f8f6b93286f014a9c634e360b\": container with ID starting with c8555d9c7d6e66d0f9bf07761f76a88c1b15de8f8f6b93286f014a9c634e360b not found: ID does not exist" containerID="c8555d9c7d6e66d0f9bf07761f76a88c1b15de8f8f6b93286f014a9c634e360b"
Apr 21 15:54:19.258236 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.258135 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8555d9c7d6e66d0f9bf07761f76a88c1b15de8f8f6b93286f014a9c634e360b"} err="failed to get container status \"c8555d9c7d6e66d0f9bf07761f76a88c1b15de8f8f6b93286f014a9c634e360b\": rpc error: code = NotFound desc = could not find container \"c8555d9c7d6e66d0f9bf07761f76a88c1b15de8f8f6b93286f014a9c634e360b\": container with ID starting with c8555d9c7d6e66d0f9bf07761f76a88c1b15de8f8f6b93286f014a9c634e360b not found: ID does not exist"
Apr 21 15:54:19.258236 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.258153 2569 scope.go:117] "RemoveContainer" containerID="029668c9e96063cdc0ed648403c87cffe2d84adc160d9c9a236d2e45b5d71235"
Apr 21 15:54:19.258403 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:54:19.258386 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"029668c9e96063cdc0ed648403c87cffe2d84adc160d9c9a236d2e45b5d71235\": container with ID starting with 029668c9e96063cdc0ed648403c87cffe2d84adc160d9c9a236d2e45b5d71235 not found: ID does not exist" containerID="029668c9e96063cdc0ed648403c87cffe2d84adc160d9c9a236d2e45b5d71235"
Apr 21 15:54:19.258463 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.258421 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"029668c9e96063cdc0ed648403c87cffe2d84adc160d9c9a236d2e45b5d71235"} err="failed to get container status \"029668c9e96063cdc0ed648403c87cffe2d84adc160d9c9a236d2e45b5d71235\": rpc error: code = NotFound desc = could not find container \"029668c9e96063cdc0ed648403c87cffe2d84adc160d9c9a236d2e45b5d71235\": container with ID starting with
029668c9e96063cdc0ed648403c87cffe2d84adc160d9c9a236d2e45b5d71235 not found: ID does not exist"
Apr 21 15:54:19.259066 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.259050 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-dhl29"]
Apr 21 15:54:19.284851 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.284820 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-kserve-provision-location\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\""
Apr 21 15:54:19.284851 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.284847 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-tokenizer-uds\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\""
Apr 21 15:54:19.285008 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.284858 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-tls-certs\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\""
Apr 21 15:54:19.285008 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:19.284866 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-twhs8\" (UniqueName: \"kubernetes.io/projected/b6e1b692-c388-4c12-a6b5-a2f6a21a7821-kube-api-access-twhs8\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\""
Apr 21 15:54:21.055047 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:21.055014 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6e1b692-c388-4c12-a6b5-a2f6a21a7821" path="/var/lib/kubelet/pods/b6e1b692-c388-4c12-a6b5-a2f6a21a7821/volumes"
Apr 21 15:54:44.198089 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.198001 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz"]
Apr 21 15:54:44.198693 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.198493 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e34df173-59f1-44e3-89c7-cc8dac69ea0d" containerName="storage-initializer"
Apr 21 15:54:44.198693 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.198512 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e34df173-59f1-44e3-89c7-cc8dac69ea0d" containerName="storage-initializer"
Apr 21 15:54:44.198693 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.198531 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e34df173-59f1-44e3-89c7-cc8dac69ea0d" containerName="main"
Apr 21 15:54:44.198693 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.198542 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e34df173-59f1-44e3-89c7-cc8dac69ea0d" containerName="main"
Apr 21 15:54:44.198693 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.198563 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b6e1b692-c388-4c12-a6b5-a2f6a21a7821" containerName="storage-initializer"
Apr 21 15:54:44.198693 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.198572 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6e1b692-c388-4c12-a6b5-a2f6a21a7821" containerName="storage-initializer"
Apr 21 15:54:44.198693 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.198583 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b6e1b692-c388-4c12-a6b5-a2f6a21a7821" containerName="tokenizer"
Apr 21 15:54:44.198693 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.198592 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6e1b692-c388-4c12-a6b5-a2f6a21a7821" containerName="tokenizer"
Apr 21 15:54:44.198693 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.198613 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b6e1b692-c388-4c12-a6b5-a2f6a21a7821" containerName="main"
Apr 21 15:54:44.198693 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.198621 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6e1b692-c388-4c12-a6b5-a2f6a21a7821" containerName="main"
Apr 21 15:54:44.199192 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.198709 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b6e1b692-c388-4c12-a6b5-a2f6a21a7821" containerName="main"
Apr 21 15:54:44.199192 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.198724 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="e34df173-59f1-44e3-89c7-cc8dac69ea0d" containerName="main"
Apr 21 15:54:44.199192 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.198735 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b6e1b692-c388-4c12-a6b5-a2f6a21a7821" containerName="tokenizer"
Apr 21 15:54:44.201958 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.201927 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz"
Apr 21 15:54:44.205212 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.205193 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-lp6xp\""
Apr 21 15:54:44.205333 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.205315 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\""
Apr 21 15:54:44.205472 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.205461 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 21 15:54:44.206161 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.206145 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 21 15:54:44.213432 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.213409 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz"]
Apr 21 15:54:44.291747 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.291710 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/40169d3f-9dce-46eb-b072-bfc6f304c205-dshm\") pod \"stop-feature-test-kserve-7b655f99d9-gkwkz\" (UID: \"40169d3f-9dce-46eb-b072-bfc6f304c205\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz"
Apr 21 15:54:44.291918 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.291766 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/40169d3f-9dce-46eb-b072-bfc6f304c205-model-cache\") pod \"stop-feature-test-kserve-7b655f99d9-gkwkz\" (UID: \"40169d3f-9dce-46eb-b072-bfc6f304c205\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz"
Apr 21 15:54:44.291918 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.291786 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40169d3f-9dce-46eb-b072-bfc6f304c205-kserve-provision-location\") pod \"stop-feature-test-kserve-7b655f99d9-gkwkz\" (UID: \"40169d3f-9dce-46eb-b072-bfc6f304c205\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz"
Apr 21 15:54:44.291918 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.291808 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj2gf\" (UniqueName: \"kubernetes.io/projected/40169d3f-9dce-46eb-b072-bfc6f304c205-kube-api-access-xj2gf\") pod \"stop-feature-test-kserve-7b655f99d9-gkwkz\" (UID: \"40169d3f-9dce-46eb-b072-bfc6f304c205\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz"
Apr 21 15:54:44.291918 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.291826 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/40169d3f-9dce-46eb-b072-bfc6f304c205-tls-certs\") pod \"stop-feature-test-kserve-7b655f99d9-gkwkz\" (UID: \"40169d3f-9dce-46eb-b072-bfc6f304c205\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz"
Apr 21 15:54:44.291918 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.291842 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/40169d3f-9dce-46eb-b072-bfc6f304c205-home\") pod \"stop-feature-test-kserve-7b655f99d9-gkwkz\" (UID: \"40169d3f-9dce-46eb-b072-bfc6f304c205\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz"
Apr 21 15:54:44.392240 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.392199 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/40169d3f-9dce-46eb-b072-bfc6f304c205-dshm\") pod \"stop-feature-test-kserve-7b655f99d9-gkwkz\" (UID: \"40169d3f-9dce-46eb-b072-bfc6f304c205\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz"
Apr 21 15:54:44.392435 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.392281 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/40169d3f-9dce-46eb-b072-bfc6f304c205-model-cache\") pod \"stop-feature-test-kserve-7b655f99d9-gkwkz\" (UID: \"40169d3f-9dce-46eb-b072-bfc6f304c205\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz"
Apr 21 15:54:44.392435 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.392312 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40169d3f-9dce-46eb-b072-bfc6f304c205-kserve-provision-location\") pod \"stop-feature-test-kserve-7b655f99d9-gkwkz\" (UID: \"40169d3f-9dce-46eb-b072-bfc6f304c205\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz"
Apr 21 15:54:44.392435 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.392343 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xj2gf\" (UniqueName: \"kubernetes.io/projected/40169d3f-9dce-46eb-b072-bfc6f304c205-kube-api-access-xj2gf\") pod \"stop-feature-test-kserve-7b655f99d9-gkwkz\" (UID: \"40169d3f-9dce-46eb-b072-bfc6f304c205\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz"
Apr 21 15:54:44.392435 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.392370 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/40169d3f-9dce-46eb-b072-bfc6f304c205-tls-certs\") pod
\"stop-feature-test-kserve-7b655f99d9-gkwkz\" (UID: \"40169d3f-9dce-46eb-b072-bfc6f304c205\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz"
Apr 21 15:54:44.392435 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.392394 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/40169d3f-9dce-46eb-b072-bfc6f304c205-home\") pod \"stop-feature-test-kserve-7b655f99d9-gkwkz\" (UID: \"40169d3f-9dce-46eb-b072-bfc6f304c205\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz"
Apr 21 15:54:44.392882 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.392854 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/40169d3f-9dce-46eb-b072-bfc6f304c205-model-cache\") pod \"stop-feature-test-kserve-7b655f99d9-gkwkz\" (UID: \"40169d3f-9dce-46eb-b072-bfc6f304c205\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz"
Apr 21 15:54:44.392882 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.392870 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40169d3f-9dce-46eb-b072-bfc6f304c205-kserve-provision-location\") pod \"stop-feature-test-kserve-7b655f99d9-gkwkz\" (UID: \"40169d3f-9dce-46eb-b072-bfc6f304c205\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz"
Apr 21 15:54:44.393013 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.392952 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/40169d3f-9dce-46eb-b072-bfc6f304c205-home\") pod \"stop-feature-test-kserve-7b655f99d9-gkwkz\" (UID: \"40169d3f-9dce-46eb-b072-bfc6f304c205\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz"
Apr 21 15:54:44.394683 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.394655 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/40169d3f-9dce-46eb-b072-bfc6f304c205-dshm\") pod \"stop-feature-test-kserve-7b655f99d9-gkwkz\" (UID: \"40169d3f-9dce-46eb-b072-bfc6f304c205\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz"
Apr 21 15:54:44.394924 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.394904 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/40169d3f-9dce-46eb-b072-bfc6f304c205-tls-certs\") pod \"stop-feature-test-kserve-7b655f99d9-gkwkz\" (UID: \"40169d3f-9dce-46eb-b072-bfc6f304c205\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz"
Apr 21 15:54:44.401059 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.401038 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj2gf\" (UniqueName: \"kubernetes.io/projected/40169d3f-9dce-46eb-b072-bfc6f304c205-kube-api-access-xj2gf\") pod \"stop-feature-test-kserve-7b655f99d9-gkwkz\" (UID: \"40169d3f-9dce-46eb-b072-bfc6f304c205\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz"
Apr 21 15:54:44.512528 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.512402 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz"
Apr 21 15:54:44.515011 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.514987 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j"]
Apr 21 15:54:44.519856 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.519835 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j"
Apr 21 15:54:44.522781 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.522761 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-nzp4g\""
Apr 21 15:54:44.538709 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.538681 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j"]
Apr 21 15:54:44.594087 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.594056 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-l844j\" (UID: \"8f70c15a-571c-43e0-8faf-a6f48cbbf12b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j"
Apr 21 15:54:44.594260 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.594100 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-l844j\" (UID: \"8f70c15a-571c-43e0-8faf-a6f48cbbf12b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j"
Apr 21 15:54:44.594260 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.594126 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4rd7\" (UniqueName: \"kubernetes.io/projected/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-kube-api-access-h4rd7\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-l844j\" (UID: \"8f70c15a-571c-43e0-8faf-a6f48cbbf12b\") "
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j"
Apr 21 15:54:44.594260 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.594204 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-l844j\" (UID: \"8f70c15a-571c-43e0-8faf-a6f48cbbf12b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j"
Apr 21 15:54:44.594260 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.594240 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-l844j\" (UID: \"8f70c15a-571c-43e0-8faf-a6f48cbbf12b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j"
Apr 21 15:54:44.594260 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.594258 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-l844j\" (UID: \"8f70c15a-571c-43e0-8faf-a6f48cbbf12b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j"
Apr 21 15:54:44.645669 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.645646 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz"]
Apr 21 15:54:44.648009 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:54:44.647979 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40169d3f_9dce_46eb_b072_bfc6f304c205.slice/crio-0c12343a6bdc22633269987abbf645405f3b7603a28054c15e026ad5a070e9c3 WatchSource:0}: Error finding container 0c12343a6bdc22633269987abbf645405f3b7603a28054c15e026ad5a070e9c3: Status 404 returned error can't find the container with id 0c12343a6bdc22633269987abbf645405f3b7603a28054c15e026ad5a070e9c3
Apr 21 15:54:44.694791 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.694756 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-l844j\" (UID: \"8f70c15a-571c-43e0-8faf-a6f48cbbf12b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j"
Apr 21 15:54:44.694957 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.694816 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h4rd7\" (UniqueName: \"kubernetes.io/projected/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-kube-api-access-h4rd7\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-l844j\" (UID: \"8f70c15a-571c-43e0-8faf-a6f48cbbf12b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j"
Apr 21 15:54:44.694957 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.694856 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-l844j\" (UID: \"8f70c15a-571c-43e0-8faf-a6f48cbbf12b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j"
Apr 21 15:54:44.694957 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.694887 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-l844j\" (UID: \"8f70c15a-571c-43e0-8faf-a6f48cbbf12b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j"
Apr 21 15:54:44.694957 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.694911 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-l844j\" (UID: \"8f70c15a-571c-43e0-8faf-a6f48cbbf12b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j"
Apr 21 15:54:44.695179 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.695004 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-l844j\" (UID: \"8f70c15a-571c-43e0-8faf-a6f48cbbf12b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j"
Apr 21 15:54:44.695320 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.695294 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-l844j\" (UID: \"8f70c15a-571c-43e0-8faf-a6f48cbbf12b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j"
Apr 21 15:54:44.695392 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.695343 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-l844j\" (UID: \"8f70c15a-571c-43e0-8faf-a6f48cbbf12b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j"
Apr 21 15:54:44.695444 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.695421 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-l844j\" (UID: \"8f70c15a-571c-43e0-8faf-a6f48cbbf12b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j"
Apr 21 15:54:44.695768 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.695742 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-l844j\" (UID: \"8f70c15a-571c-43e0-8faf-a6f48cbbf12b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j"
Apr 21 15:54:44.697747 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.697724 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-7bc6674995-l844j\" (UID: \"8f70c15a-571c-43e0-8faf-a6f48cbbf12b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j"
Apr 21 15:54:44.704163 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.704139 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4rd7\" (UniqueName: \"kubernetes.io/projected/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-kube-api-access-h4rd7\") pod
\"stop-feature-test-kserve-router-scheduler-7bc6674995-l844j\" (UID: \"8f70c15a-571c-43e0-8faf-a6f48cbbf12b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j" Apr 21 15:54:44.843963 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.843853 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j" Apr 21 15:54:44.982136 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:44.982105 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j"] Apr 21 15:54:44.983667 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:54:44.983628 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f70c15a_571c_43e0_8faf_a6f48cbbf12b.slice/crio-ac92816b14238efcec80371d4cefced0aeeb8ab551594766446cd77698f2feb7 WatchSource:0}: Error finding container ac92816b14238efcec80371d4cefced0aeeb8ab551594766446cd77698f2feb7: Status 404 returned error can't find the container with id ac92816b14238efcec80371d4cefced0aeeb8ab551594766446cd77698f2feb7 Apr 21 15:54:45.333197 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:45.333158 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j" event={"ID":"8f70c15a-571c-43e0-8faf-a6f48cbbf12b","Type":"ContainerStarted","Data":"2d53579f93bd52cc5c520be0748faf8ce48b3d71371a07af13fa01d4abb4ef6d"} Apr 21 15:54:45.333197 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:45.333199 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j" event={"ID":"8f70c15a-571c-43e0-8faf-a6f48cbbf12b","Type":"ContainerStarted","Data":"ac92816b14238efcec80371d4cefced0aeeb8ab551594766446cd77698f2feb7"} Apr 21 15:54:45.334610 
ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:45.334585 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz" event={"ID":"40169d3f-9dce-46eb-b072-bfc6f304c205","Type":"ContainerStarted","Data":"41d64f8ea2ea4bf040b167dc987f7acbfdf81d5c18bfb057db5adf919d17e750"} Apr 21 15:54:45.334610 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:45.334612 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz" event={"ID":"40169d3f-9dce-46eb-b072-bfc6f304c205","Type":"ContainerStarted","Data":"0c12343a6bdc22633269987abbf645405f3b7603a28054c15e026ad5a070e9c3"} Apr 21 15:54:46.341961 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:46.341922 2569 generic.go:358] "Generic (PLEG): container finished" podID="8f70c15a-571c-43e0-8faf-a6f48cbbf12b" containerID="2d53579f93bd52cc5c520be0748faf8ce48b3d71371a07af13fa01d4abb4ef6d" exitCode=0 Apr 21 15:54:46.342343 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:46.341999 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j" event={"ID":"8f70c15a-571c-43e0-8faf-a6f48cbbf12b","Type":"ContainerDied","Data":"2d53579f93bd52cc5c520be0748faf8ce48b3d71371a07af13fa01d4abb4ef6d"} Apr 21 15:54:47.349001 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:47.348963 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j" event={"ID":"8f70c15a-571c-43e0-8faf-a6f48cbbf12b","Type":"ContainerStarted","Data":"7b6a284a0cab1b3ddf223de358a4b2e2a71bfda263df17f590f08b4795164de7"} Apr 21 15:54:47.349001 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:47.349002 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j" 
event={"ID":"8f70c15a-571c-43e0-8faf-a6f48cbbf12b","Type":"ContainerStarted","Data":"f296f5a767ad1b9b6c77a412570b9d4cf0ae945a57b9efdc3244251181879223"} Apr 21 15:54:47.349454 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:47.349101 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j" Apr 21 15:54:47.380411 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:47.380358 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j" podStartSLOduration=3.380341884 podStartE2EDuration="3.380341884s" podCreationTimestamp="2026-04-21 15:54:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:54:47.377378822 +0000 UTC m=+1162.949411765" watchObservedRunningTime="2026-04-21 15:54:47.380341884 +0000 UTC m=+1162.952374825" Apr 21 15:54:49.358085 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:49.358051 2569 generic.go:358] "Generic (PLEG): container finished" podID="40169d3f-9dce-46eb-b072-bfc6f304c205" containerID="41d64f8ea2ea4bf040b167dc987f7acbfdf81d5c18bfb057db5adf919d17e750" exitCode=0 Apr 21 15:54:49.358474 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:49.358115 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz" event={"ID":"40169d3f-9dce-46eb-b072-bfc6f304c205","Type":"ContainerDied","Data":"41d64f8ea2ea4bf040b167dc987f7acbfdf81d5c18bfb057db5adf919d17e750"} Apr 21 15:54:50.365386 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:50.365347 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz" 
event={"ID":"40169d3f-9dce-46eb-b072-bfc6f304c205","Type":"ContainerStarted","Data":"b03c447cd8a6533c0adac7f2a71b359aa00eb001c4b9d7232e1e61caf276160d"} Apr 21 15:54:50.388346 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:50.388281 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz" podStartSLOduration=6.388261449 podStartE2EDuration="6.388261449s" podCreationTimestamp="2026-04-21 15:54:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:54:50.387162666 +0000 UTC m=+1165.959195645" watchObservedRunningTime="2026-04-21 15:54:50.388261449 +0000 UTC m=+1165.960294391" Apr 21 15:54:54.512717 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:54.512673 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz" Apr 21 15:54:54.512717 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:54.512725 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz" Apr 21 15:54:54.514211 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:54.514177 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz" podUID="40169d3f-9dce-46eb-b072-bfc6f304c205" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 21 15:54:54.844918 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:54.844816 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j" Apr 21 15:54:54.844918 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:54.844869 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j" Apr 21 15:54:54.847733 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:54.847703 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j" Apr 21 15:54:55.387595 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:54:55.387563 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j" Apr 21 15:55:04.512921 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:55:04.512873 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz" podUID="40169d3f-9dce-46eb-b072-bfc6f304c205" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 21 15:55:14.513604 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:55:14.513555 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz" podUID="40169d3f-9dce-46eb-b072-bfc6f304c205" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 21 15:55:17.395604 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:55:17.395574 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j" Apr 21 15:55:24.513590 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:55:24.513544 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz" podUID="40169d3f-9dce-46eb-b072-bfc6f304c205" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: 
connection refused" Apr 21 15:55:25.024634 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:55:25.024602 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wkxqg_7b0083d8-b152-40fa-9a89-e3180ed1747d/ovn-acl-logging/0.log" Apr 21 15:55:25.024634 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:55:25.024605 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wkxqg_7b0083d8-b152-40fa-9a89-e3180ed1747d/ovn-acl-logging/0.log" Apr 21 15:55:34.512960 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:55:34.512915 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz" podUID="40169d3f-9dce-46eb-b072-bfc6f304c205" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 21 15:55:44.513229 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:55:44.513184 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz" podUID="40169d3f-9dce-46eb-b072-bfc6f304c205" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 21 15:55:54.512935 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:55:54.512887 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz" podUID="40169d3f-9dce-46eb-b072-bfc6f304c205" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 21 15:56:04.513404 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:04.513358 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz" podUID="40169d3f-9dce-46eb-b072-bfc6f304c205" containerName="main" 
probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 21 15:56:14.513666 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:14.513572 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz" podUID="40169d3f-9dce-46eb-b072-bfc6f304c205" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 21 15:56:24.522808 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:24.522770 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz" Apr 21 15:56:24.530863 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:24.530833 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz" Apr 21 15:56:25.695371 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:25.695336 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz"] Apr 21 15:56:25.703813 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:25.703778 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j"] Apr 21 15:56:25.704204 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:25.704173 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j" podUID="8f70c15a-571c-43e0-8faf-a6f48cbbf12b" containerName="main" containerID="cri-o://f296f5a767ad1b9b6c77a412570b9d4cf0ae945a57b9efdc3244251181879223" gracePeriod=30 Apr 21 15:56:25.704321 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:25.704229 2569 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j" podUID="8f70c15a-571c-43e0-8faf-a6f48cbbf12b" containerName="tokenizer" containerID="cri-o://7b6a284a0cab1b3ddf223de358a4b2e2a71bfda263df17f590f08b4795164de7" gracePeriod=30 Apr 21 15:56:25.746312 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:25.746271 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz" podUID="40169d3f-9dce-46eb-b072-bfc6f304c205" containerName="main" containerID="cri-o://b03c447cd8a6533c0adac7f2a71b359aa00eb001c4b9d7232e1e61caf276160d" gracePeriod=30 Apr 21 15:56:26.753227 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:26.753194 2569 generic.go:358] "Generic (PLEG): container finished" podID="8f70c15a-571c-43e0-8faf-a6f48cbbf12b" containerID="f296f5a767ad1b9b6c77a412570b9d4cf0ae945a57b9efdc3244251181879223" exitCode=0 Apr 21 15:56:26.753623 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:26.753272 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j" event={"ID":"8f70c15a-571c-43e0-8faf-a6f48cbbf12b","Type":"ContainerDied","Data":"f296f5a767ad1b9b6c77a412570b9d4cf0ae945a57b9efdc3244251181879223"} Apr 21 15:56:26.968678 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:26.968655 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j" Apr 21 15:56:26.993410 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:26.993380 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4rd7\" (UniqueName: \"kubernetes.io/projected/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-kube-api-access-h4rd7\") pod \"8f70c15a-571c-43e0-8faf-a6f48cbbf12b\" (UID: \"8f70c15a-571c-43e0-8faf-a6f48cbbf12b\") " Apr 21 15:56:26.993590 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:26.993456 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-tokenizer-cache\") pod \"8f70c15a-571c-43e0-8faf-a6f48cbbf12b\" (UID: \"8f70c15a-571c-43e0-8faf-a6f48cbbf12b\") " Apr 21 15:56:26.993590 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:26.993515 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-tokenizer-uds\") pod \"8f70c15a-571c-43e0-8faf-a6f48cbbf12b\" (UID: \"8f70c15a-571c-43e0-8faf-a6f48cbbf12b\") " Apr 21 15:56:26.993590 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:26.993541 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-tls-certs\") pod \"8f70c15a-571c-43e0-8faf-a6f48cbbf12b\" (UID: \"8f70c15a-571c-43e0-8faf-a6f48cbbf12b\") " Apr 21 15:56:26.993590 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:26.993580 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-tokenizer-tmp\") pod \"8f70c15a-571c-43e0-8faf-a6f48cbbf12b\" (UID: \"8f70c15a-571c-43e0-8faf-a6f48cbbf12b\") " Apr 21 15:56:26.993812 
ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:26.993695 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-kserve-provision-location\") pod \"8f70c15a-571c-43e0-8faf-a6f48cbbf12b\" (UID: \"8f70c15a-571c-43e0-8faf-a6f48cbbf12b\") " Apr 21 15:56:26.993867 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:26.993806 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "8f70c15a-571c-43e0-8faf-a6f48cbbf12b" (UID: "8f70c15a-571c-43e0-8faf-a6f48cbbf12b"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:56:26.993867 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:26.993822 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "8f70c15a-571c-43e0-8faf-a6f48cbbf12b" (UID: "8f70c15a-571c-43e0-8faf-a6f48cbbf12b"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:56:26.994101 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:26.994039 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-tokenizer-cache\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:56:26.994101 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:26.994059 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-tokenizer-uds\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:56:26.994519 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:26.994475 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "8f70c15a-571c-43e0-8faf-a6f48cbbf12b" (UID: "8f70c15a-571c-43e0-8faf-a6f48cbbf12b"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:56:26.994832 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:26.994803 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8f70c15a-571c-43e0-8faf-a6f48cbbf12b" (UID: "8f70c15a-571c-43e0-8faf-a6f48cbbf12b"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:56:26.996117 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:26.996091 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-kube-api-access-h4rd7" (OuterVolumeSpecName: "kube-api-access-h4rd7") pod "8f70c15a-571c-43e0-8faf-a6f48cbbf12b" (UID: "8f70c15a-571c-43e0-8faf-a6f48cbbf12b"). InnerVolumeSpecName "kube-api-access-h4rd7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:56:26.996348 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:26.996302 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8f70c15a-571c-43e0-8faf-a6f48cbbf12b" (UID: "8f70c15a-571c-43e0-8faf-a6f48cbbf12b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:56:27.094728 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:27.094692 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-kserve-provision-location\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:56:27.094728 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:27.094727 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h4rd7\" (UniqueName: \"kubernetes.io/projected/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-kube-api-access-h4rd7\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:56:27.094942 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:27.094738 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-tls-certs\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:56:27.094942 ip-10-0-128-232 kubenswrapper[2569]: 
I0421 15:56:27.094750 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8f70c15a-571c-43e0-8faf-a6f48cbbf12b-tokenizer-tmp\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:56:27.758628 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:27.758591 2569 generic.go:358] "Generic (PLEG): container finished" podID="8f70c15a-571c-43e0-8faf-a6f48cbbf12b" containerID="7b6a284a0cab1b3ddf223de358a4b2e2a71bfda263df17f590f08b4795164de7" exitCode=0 Apr 21 15:56:27.759120 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:27.758678 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j" Apr 21 15:56:27.759120 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:27.758679 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j" event={"ID":"8f70c15a-571c-43e0-8faf-a6f48cbbf12b","Type":"ContainerDied","Data":"7b6a284a0cab1b3ddf223de358a4b2e2a71bfda263df17f590f08b4795164de7"} Apr 21 15:56:27.759120 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:27.758720 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j" event={"ID":"8f70c15a-571c-43e0-8faf-a6f48cbbf12b","Type":"ContainerDied","Data":"ac92816b14238efcec80371d4cefced0aeeb8ab551594766446cd77698f2feb7"} Apr 21 15:56:27.759120 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:27.758735 2569 scope.go:117] "RemoveContainer" containerID="7b6a284a0cab1b3ddf223de358a4b2e2a71bfda263df17f590f08b4795164de7" Apr 21 15:56:27.767414 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:27.767394 2569 scope.go:117] "RemoveContainer" containerID="f296f5a767ad1b9b6c77a412570b9d4cf0ae945a57b9efdc3244251181879223" Apr 21 15:56:27.775822 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:27.775798 2569 
scope.go:117] "RemoveContainer" containerID="2d53579f93bd52cc5c520be0748faf8ce48b3d71371a07af13fa01d4abb4ef6d" Apr 21 15:56:27.783760 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:27.783734 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j"] Apr 21 15:56:27.784245 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:27.784219 2569 scope.go:117] "RemoveContainer" containerID="7b6a284a0cab1b3ddf223de358a4b2e2a71bfda263df17f590f08b4795164de7" Apr 21 15:56:27.784552 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:56:27.784527 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b6a284a0cab1b3ddf223de358a4b2e2a71bfda263df17f590f08b4795164de7\": container with ID starting with 7b6a284a0cab1b3ddf223de358a4b2e2a71bfda263df17f590f08b4795164de7 not found: ID does not exist" containerID="7b6a284a0cab1b3ddf223de358a4b2e2a71bfda263df17f590f08b4795164de7" Apr 21 15:56:27.784626 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:27.784566 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b6a284a0cab1b3ddf223de358a4b2e2a71bfda263df17f590f08b4795164de7"} err="failed to get container status \"7b6a284a0cab1b3ddf223de358a4b2e2a71bfda263df17f590f08b4795164de7\": rpc error: code = NotFound desc = could not find container \"7b6a284a0cab1b3ddf223de358a4b2e2a71bfda263df17f590f08b4795164de7\": container with ID starting with 7b6a284a0cab1b3ddf223de358a4b2e2a71bfda263df17f590f08b4795164de7 not found: ID does not exist" Apr 21 15:56:27.784626 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:27.784592 2569 scope.go:117] "RemoveContainer" containerID="f296f5a767ad1b9b6c77a412570b9d4cf0ae945a57b9efdc3244251181879223" Apr 21 15:56:27.784861 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:56:27.784844 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"f296f5a767ad1b9b6c77a412570b9d4cf0ae945a57b9efdc3244251181879223\": container with ID starting with f296f5a767ad1b9b6c77a412570b9d4cf0ae945a57b9efdc3244251181879223 not found: ID does not exist" containerID="f296f5a767ad1b9b6c77a412570b9d4cf0ae945a57b9efdc3244251181879223" Apr 21 15:56:27.784903 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:27.784866 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f296f5a767ad1b9b6c77a412570b9d4cf0ae945a57b9efdc3244251181879223"} err="failed to get container status \"f296f5a767ad1b9b6c77a412570b9d4cf0ae945a57b9efdc3244251181879223\": rpc error: code = NotFound desc = could not find container \"f296f5a767ad1b9b6c77a412570b9d4cf0ae945a57b9efdc3244251181879223\": container with ID starting with f296f5a767ad1b9b6c77a412570b9d4cf0ae945a57b9efdc3244251181879223 not found: ID does not exist" Apr 21 15:56:27.784903 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:27.784881 2569 scope.go:117] "RemoveContainer" containerID="2d53579f93bd52cc5c520be0748faf8ce48b3d71371a07af13fa01d4abb4ef6d" Apr 21 15:56:27.785099 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:56:27.785083 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d53579f93bd52cc5c520be0748faf8ce48b3d71371a07af13fa01d4abb4ef6d\": container with ID starting with 2d53579f93bd52cc5c520be0748faf8ce48b3d71371a07af13fa01d4abb4ef6d not found: ID does not exist" containerID="2d53579f93bd52cc5c520be0748faf8ce48b3d71371a07af13fa01d4abb4ef6d" Apr 21 15:56:27.785153 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:27.785101 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d53579f93bd52cc5c520be0748faf8ce48b3d71371a07af13fa01d4abb4ef6d"} err="failed to get container status \"2d53579f93bd52cc5c520be0748faf8ce48b3d71371a07af13fa01d4abb4ef6d\": rpc error: code = NotFound desc 
= could not find container \"2d53579f93bd52cc5c520be0748faf8ce48b3d71371a07af13fa01d4abb4ef6d\": container with ID starting with 2d53579f93bd52cc5c520be0748faf8ce48b3d71371a07af13fa01d4abb4ef6d not found: ID does not exist" Apr 21 15:56:27.794533 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:27.794495 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc6674995-l844j"] Apr 21 15:56:28.437614 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:28.437556 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-6bf4699d45-xvlhs"] Apr 21 15:56:28.437983 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:28.437965 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f70c15a-571c-43e0-8faf-a6f48cbbf12b" containerName="main" Apr 21 15:56:28.438069 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:28.437986 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f70c15a-571c-43e0-8faf-a6f48cbbf12b" containerName="main" Apr 21 15:56:28.438069 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:28.438004 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f70c15a-571c-43e0-8faf-a6f48cbbf12b" containerName="storage-initializer" Apr 21 15:56:28.438069 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:28.438014 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f70c15a-571c-43e0-8faf-a6f48cbbf12b" containerName="storage-initializer" Apr 21 15:56:28.438069 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:28.438026 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f70c15a-571c-43e0-8faf-a6f48cbbf12b" containerName="tokenizer" Apr 21 15:56:28.438069 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:28.438035 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f70c15a-571c-43e0-8faf-a6f48cbbf12b" containerName="tokenizer" Apr 21 15:56:28.438342 ip-10-0-128-232 
kubenswrapper[2569]: I0421 15:56:28.438127 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f70c15a-571c-43e0-8faf-a6f48cbbf12b" containerName="main" Apr 21 15:56:28.438342 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:28.438147 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f70c15a-571c-43e0-8faf-a6f48cbbf12b" containerName="tokenizer" Apr 21 15:56:28.442463 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:28.442441 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-6bf4699d45-xvlhs" Apr 21 15:56:28.445251 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:28.445210 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-ftn4z\"" Apr 21 15:56:28.445251 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:28.445215 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 21 15:56:28.446183 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:28.446167 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 21 15:56:28.446243 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:28.446189 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 21 15:56:28.458594 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:28.458566 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-6bf4699d45-xvlhs"] Apr 21 15:56:28.506562 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:28.506525 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdj2s\" (UniqueName: \"kubernetes.io/projected/ba9f8bad-3d2d-4707-b90e-c49c16cd94bc-kube-api-access-zdj2s\") pod \"llmisvc-controller-manager-6bf4699d45-xvlhs\" (UID: 
\"ba9f8bad-3d2d-4707-b90e-c49c16cd94bc\") " pod="kserve/llmisvc-controller-manager-6bf4699d45-xvlhs" Apr 21 15:56:28.506754 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:28.506605 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba9f8bad-3d2d-4707-b90e-c49c16cd94bc-cert\") pod \"llmisvc-controller-manager-6bf4699d45-xvlhs\" (UID: \"ba9f8bad-3d2d-4707-b90e-c49c16cd94bc\") " pod="kserve/llmisvc-controller-manager-6bf4699d45-xvlhs" Apr 21 15:56:28.607899 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:28.607859 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba9f8bad-3d2d-4707-b90e-c49c16cd94bc-cert\") pod \"llmisvc-controller-manager-6bf4699d45-xvlhs\" (UID: \"ba9f8bad-3d2d-4707-b90e-c49c16cd94bc\") " pod="kserve/llmisvc-controller-manager-6bf4699d45-xvlhs" Apr 21 15:56:28.608053 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:28.607935 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdj2s\" (UniqueName: \"kubernetes.io/projected/ba9f8bad-3d2d-4707-b90e-c49c16cd94bc-kube-api-access-zdj2s\") pod \"llmisvc-controller-manager-6bf4699d45-xvlhs\" (UID: \"ba9f8bad-3d2d-4707-b90e-c49c16cd94bc\") " pod="kserve/llmisvc-controller-manager-6bf4699d45-xvlhs" Apr 21 15:56:28.610213 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:28.610195 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba9f8bad-3d2d-4707-b90e-c49c16cd94bc-cert\") pod \"llmisvc-controller-manager-6bf4699d45-xvlhs\" (UID: \"ba9f8bad-3d2d-4707-b90e-c49c16cd94bc\") " pod="kserve/llmisvc-controller-manager-6bf4699d45-xvlhs" Apr 21 15:56:28.619757 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:28.619733 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdj2s\" (UniqueName: 
\"kubernetes.io/projected/ba9f8bad-3d2d-4707-b90e-c49c16cd94bc-kube-api-access-zdj2s\") pod \"llmisvc-controller-manager-6bf4699d45-xvlhs\" (UID: \"ba9f8bad-3d2d-4707-b90e-c49c16cd94bc\") " pod="kserve/llmisvc-controller-manager-6bf4699d45-xvlhs" Apr 21 15:56:28.752651 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:28.752556 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-6bf4699d45-xvlhs" Apr 21 15:56:28.886198 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:28.886172 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-6bf4699d45-xvlhs"] Apr 21 15:56:28.887915 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:56:28.887890 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podba9f8bad_3d2d_4707_b90e_c49c16cd94bc.slice/crio-15b00ee487091fa68406fbb5c8fe755a0afedbde8cc4dbd39eaac666a93973cf WatchSource:0}: Error finding container 15b00ee487091fa68406fbb5c8fe755a0afedbde8cc4dbd39eaac666a93973cf: Status 404 returned error can't find the container with id 15b00ee487091fa68406fbb5c8fe755a0afedbde8cc4dbd39eaac666a93973cf Apr 21 15:56:28.889591 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:28.889572 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:56:29.059950 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:29.059867 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f70c15a-571c-43e0-8faf-a6f48cbbf12b" path="/var/lib/kubelet/pods/8f70c15a-571c-43e0-8faf-a6f48cbbf12b/volumes" Apr 21 15:56:29.768246 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:29.768213 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6bf4699d45-xvlhs" event={"ID":"ba9f8bad-3d2d-4707-b90e-c49c16cd94bc","Type":"ContainerStarted","Data":"15b00ee487091fa68406fbb5c8fe755a0afedbde8cc4dbd39eaac666a93973cf"} Apr 21 15:56:32.782822 
ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:32.782779 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6bf4699d45-xvlhs" event={"ID":"ba9f8bad-3d2d-4707-b90e-c49c16cd94bc","Type":"ContainerStarted","Data":"1301a3c3b3c9c8b9ae3049199c3609c049a701d21cbc0b0387b2b798fee6e011"} Apr 21 15:56:32.783253 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:32.782930 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-6bf4699d45-xvlhs" Apr 21 15:56:32.802358 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:32.802310 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-6bf4699d45-xvlhs" podStartSLOduration=1.678760049 podStartE2EDuration="4.802295569s" podCreationTimestamp="2026-04-21 15:56:28 +0000 UTC" firstStartedPulling="2026-04-21 15:56:28.889729236 +0000 UTC m=+1264.461762158" lastFinishedPulling="2026-04-21 15:56:32.013264739 +0000 UTC m=+1267.585297678" observedRunningTime="2026-04-21 15:56:32.800500406 +0000 UTC m=+1268.372533341" watchObservedRunningTime="2026-04-21 15:56:32.802295569 +0000 UTC m=+1268.374328506" Apr 21 15:56:55.878644 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:55.878618 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-7b655f99d9-gkwkz_40169d3f-9dce-46eb-b072-bfc6f304c205/main/0.log" Apr 21 15:56:55.879069 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:55.878992 2569 generic.go:358] "Generic (PLEG): container finished" podID="40169d3f-9dce-46eb-b072-bfc6f304c205" containerID="b03c447cd8a6533c0adac7f2a71b359aa00eb001c4b9d7232e1e61caf276160d" exitCode=137 Apr 21 15:56:55.879118 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:55.879071 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz" 
event={"ID":"40169d3f-9dce-46eb-b072-bfc6f304c205","Type":"ContainerDied","Data":"b03c447cd8a6533c0adac7f2a71b359aa00eb001c4b9d7232e1e61caf276160d"} Apr 21 15:56:55.998602 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:55.998568 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-7b655f99d9-gkwkz_40169d3f-9dce-46eb-b072-bfc6f304c205/main/0.log" Apr 21 15:56:55.998957 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:55.998939 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz" Apr 21 15:56:56.148222 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.148128 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/40169d3f-9dce-46eb-b072-bfc6f304c205-model-cache\") pod \"40169d3f-9dce-46eb-b072-bfc6f304c205\" (UID: \"40169d3f-9dce-46eb-b072-bfc6f304c205\") " Apr 21 15:56:56.148222 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.148175 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/40169d3f-9dce-46eb-b072-bfc6f304c205-home\") pod \"40169d3f-9dce-46eb-b072-bfc6f304c205\" (UID: \"40169d3f-9dce-46eb-b072-bfc6f304c205\") " Apr 21 15:56:56.148222 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.148207 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/40169d3f-9dce-46eb-b072-bfc6f304c205-dshm\") pod \"40169d3f-9dce-46eb-b072-bfc6f304c205\" (UID: \"40169d3f-9dce-46eb-b072-bfc6f304c205\") " Apr 21 15:56:56.148552 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.148241 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/40169d3f-9dce-46eb-b072-bfc6f304c205-tls-certs\") pod 
\"40169d3f-9dce-46eb-b072-bfc6f304c205\" (UID: \"40169d3f-9dce-46eb-b072-bfc6f304c205\") " Apr 21 15:56:56.148552 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.148259 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40169d3f-9dce-46eb-b072-bfc6f304c205-kserve-provision-location\") pod \"40169d3f-9dce-46eb-b072-bfc6f304c205\" (UID: \"40169d3f-9dce-46eb-b072-bfc6f304c205\") " Apr 21 15:56:56.148552 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.148284 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj2gf\" (UniqueName: \"kubernetes.io/projected/40169d3f-9dce-46eb-b072-bfc6f304c205-kube-api-access-xj2gf\") pod \"40169d3f-9dce-46eb-b072-bfc6f304c205\" (UID: \"40169d3f-9dce-46eb-b072-bfc6f304c205\") " Apr 21 15:56:56.148552 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.148434 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40169d3f-9dce-46eb-b072-bfc6f304c205-model-cache" (OuterVolumeSpecName: "model-cache") pod "40169d3f-9dce-46eb-b072-bfc6f304c205" (UID: "40169d3f-9dce-46eb-b072-bfc6f304c205"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:56:56.148732 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.148700 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40169d3f-9dce-46eb-b072-bfc6f304c205-home" (OuterVolumeSpecName: "home") pod "40169d3f-9dce-46eb-b072-bfc6f304c205" (UID: "40169d3f-9dce-46eb-b072-bfc6f304c205"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:56:56.150524 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.150491 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40169d3f-9dce-46eb-b072-bfc6f304c205-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "40169d3f-9dce-46eb-b072-bfc6f304c205" (UID: "40169d3f-9dce-46eb-b072-bfc6f304c205"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:56:56.150636 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.150580 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40169d3f-9dce-46eb-b072-bfc6f304c205-kube-api-access-xj2gf" (OuterVolumeSpecName: "kube-api-access-xj2gf") pod "40169d3f-9dce-46eb-b072-bfc6f304c205" (UID: "40169d3f-9dce-46eb-b072-bfc6f304c205"). InnerVolumeSpecName "kube-api-access-xj2gf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:56:56.150757 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.150743 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40169d3f-9dce-46eb-b072-bfc6f304c205-dshm" (OuterVolumeSpecName: "dshm") pod "40169d3f-9dce-46eb-b072-bfc6f304c205" (UID: "40169d3f-9dce-46eb-b072-bfc6f304c205"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:56:56.205606 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.205559 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40169d3f-9dce-46eb-b072-bfc6f304c205-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "40169d3f-9dce-46eb-b072-bfc6f304c205" (UID: "40169d3f-9dce-46eb-b072-bfc6f304c205"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:56:56.249211 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.249174 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xj2gf\" (UniqueName: \"kubernetes.io/projected/40169d3f-9dce-46eb-b072-bfc6f304c205-kube-api-access-xj2gf\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:56:56.249211 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.249208 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/40169d3f-9dce-46eb-b072-bfc6f304c205-model-cache\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:56:56.249211 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.249218 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/40169d3f-9dce-46eb-b072-bfc6f304c205-home\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:56:56.249439 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.249226 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/40169d3f-9dce-46eb-b072-bfc6f304c205-dshm\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:56:56.249439 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.249234 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/40169d3f-9dce-46eb-b072-bfc6f304c205-tls-certs\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:56:56.249439 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.249243 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40169d3f-9dce-46eb-b072-bfc6f304c205-kserve-provision-location\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:56:56.572079 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.572050 2569 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49"] Apr 21 15:56:56.572377 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.572366 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40169d3f-9dce-46eb-b072-bfc6f304c205" containerName="main" Apr 21 15:56:56.572427 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.572379 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="40169d3f-9dce-46eb-b072-bfc6f304c205" containerName="main" Apr 21 15:56:56.572427 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.572391 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40169d3f-9dce-46eb-b072-bfc6f304c205" containerName="storage-initializer" Apr 21 15:56:56.572427 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.572396 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="40169d3f-9dce-46eb-b072-bfc6f304c205" containerName="storage-initializer" Apr 21 15:56:56.572541 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.572459 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="40169d3f-9dce-46eb-b072-bfc6f304c205" containerName="main" Apr 21 15:56:56.577048 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.577024 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" Apr 21 15:56:56.580415 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.580392 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 21 15:56:56.588601 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.588571 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49"] Apr 21 15:56:56.753808 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.753766 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m98z\" (UniqueName: \"kubernetes.io/projected/f5a5efab-351c-4290-8928-a3397d2f90f1-kube-api-access-6m98z\") pod \"router-with-refs-test-kserve-db6d695bf-x4s49\" (UID: \"f5a5efab-351c-4290-8928-a3397d2f90f1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" Apr 21 15:56:56.753808 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.753811 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f5a5efab-351c-4290-8928-a3397d2f90f1-tls-certs\") pod \"router-with-refs-test-kserve-db6d695bf-x4s49\" (UID: \"f5a5efab-351c-4290-8928-a3397d2f90f1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" Apr 21 15:56:56.754087 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.753891 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f5a5efab-351c-4290-8928-a3397d2f90f1-dshm\") pod \"router-with-refs-test-kserve-db6d695bf-x4s49\" (UID: \"f5a5efab-351c-4290-8928-a3397d2f90f1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" Apr 21 15:56:56.754087 ip-10-0-128-232 kubenswrapper[2569]: I0421 
15:56:56.753945 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f5a5efab-351c-4290-8928-a3397d2f90f1-kserve-provision-location\") pod \"router-with-refs-test-kserve-db6d695bf-x4s49\" (UID: \"f5a5efab-351c-4290-8928-a3397d2f90f1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" Apr 21 15:56:56.754087 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.754007 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f5a5efab-351c-4290-8928-a3397d2f90f1-model-cache\") pod \"router-with-refs-test-kserve-db6d695bf-x4s49\" (UID: \"f5a5efab-351c-4290-8928-a3397d2f90f1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" Apr 21 15:56:56.754215 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.754093 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f5a5efab-351c-4290-8928-a3397d2f90f1-home\") pod \"router-with-refs-test-kserve-db6d695bf-x4s49\" (UID: \"f5a5efab-351c-4290-8928-a3397d2f90f1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" Apr 21 15:56:56.855274 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.855198 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f5a5efab-351c-4290-8928-a3397d2f90f1-kserve-provision-location\") pod \"router-with-refs-test-kserve-db6d695bf-x4s49\" (UID: \"f5a5efab-351c-4290-8928-a3397d2f90f1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" Apr 21 15:56:56.855274 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.855257 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/f5a5efab-351c-4290-8928-a3397d2f90f1-model-cache\") pod \"router-with-refs-test-kserve-db6d695bf-x4s49\" (UID: \"f5a5efab-351c-4290-8928-a3397d2f90f1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" Apr 21 15:56:56.855495 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.855312 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f5a5efab-351c-4290-8928-a3397d2f90f1-home\") pod \"router-with-refs-test-kserve-db6d695bf-x4s49\" (UID: \"f5a5efab-351c-4290-8928-a3397d2f90f1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" Apr 21 15:56:56.855495 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.855358 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6m98z\" (UniqueName: \"kubernetes.io/projected/f5a5efab-351c-4290-8928-a3397d2f90f1-kube-api-access-6m98z\") pod \"router-with-refs-test-kserve-db6d695bf-x4s49\" (UID: \"f5a5efab-351c-4290-8928-a3397d2f90f1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" Apr 21 15:56:56.855495 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.855385 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f5a5efab-351c-4290-8928-a3397d2f90f1-tls-certs\") pod \"router-with-refs-test-kserve-db6d695bf-x4s49\" (UID: \"f5a5efab-351c-4290-8928-a3397d2f90f1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" Apr 21 15:56:56.855495 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.855423 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f5a5efab-351c-4290-8928-a3397d2f90f1-dshm\") pod \"router-with-refs-test-kserve-db6d695bf-x4s49\" (UID: \"f5a5efab-351c-4290-8928-a3397d2f90f1\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" Apr 21 15:56:56.855711 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.855691 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f5a5efab-351c-4290-8928-a3397d2f90f1-kserve-provision-location\") pod \"router-with-refs-test-kserve-db6d695bf-x4s49\" (UID: \"f5a5efab-351c-4290-8928-a3397d2f90f1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" Apr 21 15:56:56.855771 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.855717 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f5a5efab-351c-4290-8928-a3397d2f90f1-model-cache\") pod \"router-with-refs-test-kserve-db6d695bf-x4s49\" (UID: \"f5a5efab-351c-4290-8928-a3397d2f90f1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" Apr 21 15:56:56.855828 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.855763 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f5a5efab-351c-4290-8928-a3397d2f90f1-home\") pod \"router-with-refs-test-kserve-db6d695bf-x4s49\" (UID: \"f5a5efab-351c-4290-8928-a3397d2f90f1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" Apr 21 15:56:56.857785 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.857755 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f5a5efab-351c-4290-8928-a3397d2f90f1-dshm\") pod \"router-with-refs-test-kserve-db6d695bf-x4s49\" (UID: \"f5a5efab-351c-4290-8928-a3397d2f90f1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" Apr 21 15:56:56.858199 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.858183 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f5a5efab-351c-4290-8928-a3397d2f90f1-tls-certs\") pod \"router-with-refs-test-kserve-db6d695bf-x4s49\" (UID: \"f5a5efab-351c-4290-8928-a3397d2f90f1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" Apr 21 15:56:56.864850 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.864822 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m98z\" (UniqueName: \"kubernetes.io/projected/f5a5efab-351c-4290-8928-a3397d2f90f1-kube-api-access-6m98z\") pod \"router-with-refs-test-kserve-db6d695bf-x4s49\" (UID: \"f5a5efab-351c-4290-8928-a3397d2f90f1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" Apr 21 15:56:56.884363 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.884333 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-7b655f99d9-gkwkz_40169d3f-9dce-46eb-b072-bfc6f304c205/main/0.log" Apr 21 15:56:56.884803 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.884785 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz" Apr 21 15:56:56.884894 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.884782 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz" event={"ID":"40169d3f-9dce-46eb-b072-bfc6f304c205","Type":"ContainerDied","Data":"0c12343a6bdc22633269987abbf645405f3b7603a28054c15e026ad5a070e9c3"} Apr 21 15:56:56.884958 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.884907 2569 scope.go:117] "RemoveContainer" containerID="b03c447cd8a6533c0adac7f2a71b359aa00eb001c4b9d7232e1e61caf276160d" Apr 21 15:56:56.890064 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.890037 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" Apr 21 15:56:56.906070 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.906046 2569 scope.go:117] "RemoveContainer" containerID="41d64f8ea2ea4bf040b167dc987f7acbfdf81d5c18bfb057db5adf919d17e750" Apr 21 15:56:56.909748 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.909722 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz"] Apr 21 15:56:56.915745 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:56.914469 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-gkwkz"] Apr 21 15:56:57.043093 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:57.043055 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49"] Apr 21 15:56:57.044173 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:56:57.044145 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5a5efab_351c_4290_8928_a3397d2f90f1.slice/crio-a8c51e4b563e415c8e13a4cbe8bef96a56d90a46432056dca90e6e21b1f4625d WatchSource:0}: Error finding container a8c51e4b563e415c8e13a4cbe8bef96a56d90a46432056dca90e6e21b1f4625d: Status 404 returned error can't find the container with id a8c51e4b563e415c8e13a4cbe8bef96a56d90a46432056dca90e6e21b1f4625d Apr 21 15:56:57.060135 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:57.060106 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40169d3f-9dce-46eb-b072-bfc6f304c205" path="/var/lib/kubelet/pods/40169d3f-9dce-46eb-b072-bfc6f304c205/volumes" Apr 21 15:56:57.890521 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:57.890455 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" 
event={"ID":"f5a5efab-351c-4290-8928-a3397d2f90f1","Type":"ContainerStarted","Data":"83d0d62201c2e8617e99e03a0f1c2f422968969f89ac85c5515962f7fafba086"} Apr 21 15:56:57.890521 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:56:57.890521 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" event={"ID":"f5a5efab-351c-4290-8928-a3397d2f90f1","Type":"ContainerStarted","Data":"a8c51e4b563e415c8e13a4cbe8bef96a56d90a46432056dca90e6e21b1f4625d"} Apr 21 15:57:01.908692 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:57:01.908660 2569 generic.go:358] "Generic (PLEG): container finished" podID="f5a5efab-351c-4290-8928-a3397d2f90f1" containerID="83d0d62201c2e8617e99e03a0f1c2f422968969f89ac85c5515962f7fafba086" exitCode=0 Apr 21 15:57:01.909060 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:57:01.908731 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" event={"ID":"f5a5efab-351c-4290-8928-a3397d2f90f1","Type":"ContainerDied","Data":"83d0d62201c2e8617e99e03a0f1c2f422968969f89ac85c5515962f7fafba086"} Apr 21 15:57:02.914755 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:57:02.914709 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" event={"ID":"f5a5efab-351c-4290-8928-a3397d2f90f1","Type":"ContainerStarted","Data":"2776361ee08df7906a03388e6f5f64b59b2a98bf8b973559ed18526cdce0de5f"} Apr 21 15:57:02.939975 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:57:02.939906 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" podStartSLOduration=6.939890891 podStartE2EDuration="6.939890891s" podCreationTimestamp="2026-04-21 15:56:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 
15:57:02.937089195 +0000 UTC m=+1298.509122137" watchObservedRunningTime="2026-04-21 15:57:02.939890891 +0000 UTC m=+1298.511923834" Apr 21 15:57:03.788985 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:57:03.788954 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-6bf4699d45-xvlhs" Apr 21 15:57:06.890847 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:57:06.890814 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" Apr 21 15:57:06.891262 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:57:06.890973 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" Apr 21 15:57:06.892442 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:57:06.892415 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" podUID="f5a5efab-351c-4290-8928-a3397d2f90f1" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused" Apr 21 15:57:16.890897 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:57:16.890834 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" podUID="f5a5efab-351c-4290-8928-a3397d2f90f1" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused" Apr 21 15:57:26.890686 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:57:26.890637 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" podUID="f5a5efab-351c-4290-8928-a3397d2f90f1" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection 
refused" Apr 21 15:57:36.891063 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:57:36.891022 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" podUID="f5a5efab-351c-4290-8928-a3397d2f90f1" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused" Apr 21 15:57:46.891113 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:57:46.891054 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" podUID="f5a5efab-351c-4290-8928-a3397d2f90f1" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused" Apr 21 15:57:56.891562 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:57:56.891514 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" podUID="f5a5efab-351c-4290-8928-a3397d2f90f1" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused" Apr 21 15:58:06.261122 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.261077 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg"] Apr 21 15:58:06.265752 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.265726 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" Apr 21 15:58:06.268296 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.268273 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 21 15:58:06.268409 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.268319 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-dockercfg-wplnj\"" Apr 21 15:58:06.274594 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.274563 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg"] Apr 21 15:58:06.342847 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.342813 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/96160468-5588-44bd-9c1f-cd17ce9cef6a-model-cache\") pod \"router-with-refs-pd-test-kserve-796b546897-pn2lg\" (UID: \"96160468-5588-44bd-9c1f-cd17ce9cef6a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" Apr 21 15:58:06.343035 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.342856 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96160468-5588-44bd-9c1f-cd17ce9cef6a-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-796b546897-pn2lg\" (UID: \"96160468-5588-44bd-9c1f-cd17ce9cef6a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" Apr 21 15:58:06.343035 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.342878 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/96160468-5588-44bd-9c1f-cd17ce9cef6a-home\") pod \"router-with-refs-pd-test-kserve-796b546897-pn2lg\" (UID: \"96160468-5588-44bd-9c1f-cd17ce9cef6a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" Apr 21 15:58:06.343035 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.343003 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/96160468-5588-44bd-9c1f-cd17ce9cef6a-dshm\") pod \"router-with-refs-pd-test-kserve-796b546897-pn2lg\" (UID: \"96160468-5588-44bd-9c1f-cd17ce9cef6a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" Apr 21 15:58:06.343152 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.343066 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/96160468-5588-44bd-9c1f-cd17ce9cef6a-tls-certs\") pod \"router-with-refs-pd-test-kserve-796b546897-pn2lg\" (UID: \"96160468-5588-44bd-9c1f-cd17ce9cef6a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" Apr 21 15:58:06.343152 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.343122 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-998vf\" (UniqueName: \"kubernetes.io/projected/96160468-5588-44bd-9c1f-cd17ce9cef6a-kube-api-access-998vf\") pod \"router-with-refs-pd-test-kserve-796b546897-pn2lg\" (UID: \"96160468-5588-44bd-9c1f-cd17ce9cef6a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" Apr 21 15:58:06.443627 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.443591 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/96160468-5588-44bd-9c1f-cd17ce9cef6a-model-cache\") pod \"router-with-refs-pd-test-kserve-796b546897-pn2lg\" (UID: 
\"96160468-5588-44bd-9c1f-cd17ce9cef6a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" Apr 21 15:58:06.443627 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.443638 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96160468-5588-44bd-9c1f-cd17ce9cef6a-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-796b546897-pn2lg\" (UID: \"96160468-5588-44bd-9c1f-cd17ce9cef6a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" Apr 21 15:58:06.443879 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.443664 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/96160468-5588-44bd-9c1f-cd17ce9cef6a-home\") pod \"router-with-refs-pd-test-kserve-796b546897-pn2lg\" (UID: \"96160468-5588-44bd-9c1f-cd17ce9cef6a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" Apr 21 15:58:06.443879 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.443713 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/96160468-5588-44bd-9c1f-cd17ce9cef6a-dshm\") pod \"router-with-refs-pd-test-kserve-796b546897-pn2lg\" (UID: \"96160468-5588-44bd-9c1f-cd17ce9cef6a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" Apr 21 15:58:06.443879 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.443747 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/96160468-5588-44bd-9c1f-cd17ce9cef6a-tls-certs\") pod \"router-with-refs-pd-test-kserve-796b546897-pn2lg\" (UID: \"96160468-5588-44bd-9c1f-cd17ce9cef6a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" Apr 21 15:58:06.443879 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.443798 
2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-998vf\" (UniqueName: \"kubernetes.io/projected/96160468-5588-44bd-9c1f-cd17ce9cef6a-kube-api-access-998vf\") pod \"router-with-refs-pd-test-kserve-796b546897-pn2lg\" (UID: \"96160468-5588-44bd-9c1f-cd17ce9cef6a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" Apr 21 15:58:06.444069 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.444014 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/96160468-5588-44bd-9c1f-cd17ce9cef6a-model-cache\") pod \"router-with-refs-pd-test-kserve-796b546897-pn2lg\" (UID: \"96160468-5588-44bd-9c1f-cd17ce9cef6a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" Apr 21 15:58:06.444069 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.444029 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/96160468-5588-44bd-9c1f-cd17ce9cef6a-home\") pod \"router-with-refs-pd-test-kserve-796b546897-pn2lg\" (UID: \"96160468-5588-44bd-9c1f-cd17ce9cef6a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" Apr 21 15:58:06.444169 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.444091 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96160468-5588-44bd-9c1f-cd17ce9cef6a-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-796b546897-pn2lg\" (UID: \"96160468-5588-44bd-9c1f-cd17ce9cef6a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" Apr 21 15:58:06.446121 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.446101 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/96160468-5588-44bd-9c1f-cd17ce9cef6a-dshm\") pod 
\"router-with-refs-pd-test-kserve-796b546897-pn2lg\" (UID: \"96160468-5588-44bd-9c1f-cd17ce9cef6a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" Apr 21 15:58:06.446535 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.446512 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/96160468-5588-44bd-9c1f-cd17ce9cef6a-tls-certs\") pod \"router-with-refs-pd-test-kserve-796b546897-pn2lg\" (UID: \"96160468-5588-44bd-9c1f-cd17ce9cef6a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" Apr 21 15:58:06.456103 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.456071 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-998vf\" (UniqueName: \"kubernetes.io/projected/96160468-5588-44bd-9c1f-cd17ce9cef6a-kube-api-access-998vf\") pod \"router-with-refs-pd-test-kserve-796b546897-pn2lg\" (UID: \"96160468-5588-44bd-9c1f-cd17ce9cef6a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" Apr 21 15:58:06.502049 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.502015 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6"] Apr 21 15:58:06.505802 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.505785 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" Apr 21 15:58:06.508656 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.508634 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-epp-sa-dockercfg-g2jjn\"" Apr 21 15:58:06.518066 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.518036 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6"] Apr 21 15:58:06.544104 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.544066 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07f63690-2746-48eb-ba3b-51b901a91a4d-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6\" (UID: \"07f63690-2746-48eb-ba3b-51b901a91a4d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" Apr 21 15:58:06.544104 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.544108 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/07f63690-2746-48eb-ba3b-51b901a91a4d-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6\" (UID: \"07f63690-2746-48eb-ba3b-51b901a91a4d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" Apr 21 15:58:06.544338 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.544145 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp9bw\" (UniqueName: \"kubernetes.io/projected/07f63690-2746-48eb-ba3b-51b901a91a4d-kube-api-access-gp9bw\") pod \"router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6\" (UID: 
\"07f63690-2746-48eb-ba3b-51b901a91a4d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" Apr 21 15:58:06.544338 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.544179 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/07f63690-2746-48eb-ba3b-51b901a91a4d-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6\" (UID: \"07f63690-2746-48eb-ba3b-51b901a91a4d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" Apr 21 15:58:06.544338 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.544212 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/07f63690-2746-48eb-ba3b-51b901a91a4d-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6\" (UID: \"07f63690-2746-48eb-ba3b-51b901a91a4d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" Apr 21 15:58:06.544338 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.544229 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/07f63690-2746-48eb-ba3b-51b901a91a4d-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6\" (UID: \"07f63690-2746-48eb-ba3b-51b901a91a4d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" Apr 21 15:58:06.577941 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.577904 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" Apr 21 15:58:06.645178 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.645145 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07f63690-2746-48eb-ba3b-51b901a91a4d-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6\" (UID: \"07f63690-2746-48eb-ba3b-51b901a91a4d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" Apr 21 15:58:06.645362 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.645190 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/07f63690-2746-48eb-ba3b-51b901a91a4d-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6\" (UID: \"07f63690-2746-48eb-ba3b-51b901a91a4d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" Apr 21 15:58:06.645362 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.645244 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gp9bw\" (UniqueName: \"kubernetes.io/projected/07f63690-2746-48eb-ba3b-51b901a91a4d-kube-api-access-gp9bw\") pod \"router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6\" (UID: \"07f63690-2746-48eb-ba3b-51b901a91a4d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" Apr 21 15:58:06.645362 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.645283 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/07f63690-2746-48eb-ba3b-51b901a91a4d-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6\" (UID: \"07f63690-2746-48eb-ba3b-51b901a91a4d\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" Apr 21 15:58:06.645362 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.645336 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/07f63690-2746-48eb-ba3b-51b901a91a4d-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6\" (UID: \"07f63690-2746-48eb-ba3b-51b901a91a4d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" Apr 21 15:58:06.645621 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.645363 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/07f63690-2746-48eb-ba3b-51b901a91a4d-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6\" (UID: \"07f63690-2746-48eb-ba3b-51b901a91a4d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" Apr 21 15:58:06.645621 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.645503 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07f63690-2746-48eb-ba3b-51b901a91a4d-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6\" (UID: \"07f63690-2746-48eb-ba3b-51b901a91a4d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" Apr 21 15:58:06.645782 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.645730 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/07f63690-2746-48eb-ba3b-51b901a91a4d-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6\" (UID: \"07f63690-2746-48eb-ba3b-51b901a91a4d\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" Apr 21 15:58:06.645887 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.645822 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/07f63690-2746-48eb-ba3b-51b901a91a4d-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6\" (UID: \"07f63690-2746-48eb-ba3b-51b901a91a4d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" Apr 21 15:58:06.646033 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.646014 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/07f63690-2746-48eb-ba3b-51b901a91a4d-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6\" (UID: \"07f63690-2746-48eb-ba3b-51b901a91a4d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" Apr 21 15:58:06.648179 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.648156 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/07f63690-2746-48eb-ba3b-51b901a91a4d-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6\" (UID: \"07f63690-2746-48eb-ba3b-51b901a91a4d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" Apr 21 15:58:06.653802 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.653725 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp9bw\" (UniqueName: \"kubernetes.io/projected/07f63690-2746-48eb-ba3b-51b901a91a4d-kube-api-access-gp9bw\") pod \"router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6\" (UID: \"07f63690-2746-48eb-ba3b-51b901a91a4d\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" Apr 21 15:58:06.723503 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.723459 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg"] Apr 21 15:58:06.726742 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:58:06.726710 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96160468_5588_44bd_9c1f_cd17ce9cef6a.slice/crio-71a7e311f54818c6eac1be0caa9c428bc0734b43093efbcb76d598ff815ae143 WatchSource:0}: Error finding container 71a7e311f54818c6eac1be0caa9c428bc0734b43093efbcb76d598ff815ae143: Status 404 returned error can't find the container with id 71a7e311f54818c6eac1be0caa9c428bc0734b43093efbcb76d598ff815ae143 Apr 21 15:58:06.817033 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.816935 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" Apr 21 15:58:06.891053 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.891010 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" podUID="f5a5efab-351c-4290-8928-a3397d2f90f1" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused" Apr 21 15:58:06.950150 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:06.950118 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6"] Apr 21 15:58:06.951180 ip-10-0-128-232 kubenswrapper[2569]: W0421 15:58:06.951151 2569 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07f63690_2746_48eb_ba3b_51b901a91a4d.slice/crio-f710390225d8745ac844e5bc1f23b6db2c5a265cd21aa9d4068b26918c8055a8 WatchSource:0}: Error finding container f710390225d8745ac844e5bc1f23b6db2c5a265cd21aa9d4068b26918c8055a8: Status 404 returned error can't find the container with id f710390225d8745ac844e5bc1f23b6db2c5a265cd21aa9d4068b26918c8055a8 Apr 21 15:58:07.180391 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:07.180339 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" event={"ID":"07f63690-2746-48eb-ba3b-51b901a91a4d","Type":"ContainerStarted","Data":"aed4966e8bb1e31db05aadbc05045da8f61449d7d6f1a5431f0e18b08d615aed"} Apr 21 15:58:07.180391 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:07.180394 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" event={"ID":"07f63690-2746-48eb-ba3b-51b901a91a4d","Type":"ContainerStarted","Data":"f710390225d8745ac844e5bc1f23b6db2c5a265cd21aa9d4068b26918c8055a8"} Apr 21 15:58:07.181902 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:07.181877 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" event={"ID":"96160468-5588-44bd-9c1f-cd17ce9cef6a","Type":"ContainerStarted","Data":"71a7e311f54818c6eac1be0caa9c428bc0734b43093efbcb76d598ff815ae143"} Apr 21 15:58:08.187921 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:08.187826 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" event={"ID":"96160468-5588-44bd-9c1f-cd17ce9cef6a","Type":"ContainerStarted","Data":"861dc0e99c7c70c980f9b8e40b6d7f1c4cd8fcbdc0ebd490c409784fb9ee8f70"} Apr 21 15:58:08.188348 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:08.187942 2569 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" Apr 21 15:58:08.189344 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:08.189312 2569 generic.go:358] "Generic (PLEG): container finished" podID="07f63690-2746-48eb-ba3b-51b901a91a4d" containerID="aed4966e8bb1e31db05aadbc05045da8f61449d7d6f1a5431f0e18b08d615aed" exitCode=0 Apr 21 15:58:08.189453 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:08.189356 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" event={"ID":"07f63690-2746-48eb-ba3b-51b901a91a4d","Type":"ContainerDied","Data":"aed4966e8bb1e31db05aadbc05045da8f61449d7d6f1a5431f0e18b08d615aed"} Apr 21 15:58:09.197233 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:09.197200 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" event={"ID":"07f63690-2746-48eb-ba3b-51b901a91a4d","Type":"ContainerStarted","Data":"be191e6e7bb8357cd80698951a3a0694ea427ccac4f201d0f972e127dcd787b6"} Apr 21 15:58:09.197233 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:09.197236 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" event={"ID":"07f63690-2746-48eb-ba3b-51b901a91a4d","Type":"ContainerStarted","Data":"9be68a52c77544e3ec67afab798abbda1f4db3763aaa2567988d4d041864a3b7"} Apr 21 15:58:09.197723 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:09.197411 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" Apr 21 15:58:09.199019 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:09.198997 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" 
event={"ID":"96160468-5588-44bd-9c1f-cd17ce9cef6a","Type":"ContainerStarted","Data":"3f557f3028387b6101e12019d7cfc28818177c4d280b7b6e15ead074553ed64f"} Apr 21 15:58:09.226832 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:09.226655 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" podStartSLOduration=3.226636043 podStartE2EDuration="3.226636043s" podCreationTimestamp="2026-04-21 15:58:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:58:09.22169507 +0000 UTC m=+1364.793728024" watchObservedRunningTime="2026-04-21 15:58:09.226636043 +0000 UTC m=+1364.798668987" Apr 21 15:58:13.220813 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:13.220780 2569 generic.go:358] "Generic (PLEG): container finished" podID="96160468-5588-44bd-9c1f-cd17ce9cef6a" containerID="3f557f3028387b6101e12019d7cfc28818177c4d280b7b6e15ead074553ed64f" exitCode=0 Apr 21 15:58:13.221223 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:13.220853 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" event={"ID":"96160468-5588-44bd-9c1f-cd17ce9cef6a","Type":"ContainerDied","Data":"3f557f3028387b6101e12019d7cfc28818177c4d280b7b6e15ead074553ed64f"} Apr 21 15:58:14.227837 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:14.227795 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" event={"ID":"96160468-5588-44bd-9c1f-cd17ce9cef6a","Type":"ContainerStarted","Data":"57b3fa7820e98cf07eecfa55033f2cb036317ef07f8c4240ad4c240def3d1c80"} Apr 21 15:58:14.256923 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:14.256859 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" podStartSLOduration=7.395584332 podStartE2EDuration="8.256837544s" podCreationTimestamp="2026-04-21 15:58:06 +0000 UTC" firstStartedPulling="2026-04-21 15:58:06.728613817 +0000 UTC m=+1362.300646741" lastFinishedPulling="2026-04-21 15:58:07.589867033 +0000 UTC m=+1363.161899953" observedRunningTime="2026-04-21 15:58:14.252053218 +0000 UTC m=+1369.824086173" watchObservedRunningTime="2026-04-21 15:58:14.256837544 +0000 UTC m=+1369.828870502" Apr 21 15:58:16.578953 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:16.578896 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" Apr 21 15:58:16.578953 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:16.578955 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" Apr 21 15:58:16.580369 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:16.580336 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" podUID="96160468-5588-44bd-9c1f-cd17ce9cef6a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8001/health\": dial tcp 10.133.0.53:8001: connect: connection refused" Apr 21 15:58:16.817070 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:16.817029 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" Apr 21 15:58:16.817277 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:16.817088 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" Apr 21 15:58:16.820353 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:16.820327 2569 kubelet.go:2658] "SyncLoop 
(probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" Apr 21 15:58:16.890789 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:16.890685 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" podUID="f5a5efab-351c-4290-8928-a3397d2f90f1" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused" Apr 21 15:58:17.241996 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:17.241967 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" Apr 21 15:58:26.578608 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:26.578548 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" podUID="96160468-5588-44bd-9c1f-cd17ce9cef6a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8001/health\": dial tcp 10.133.0.53:8001: connect: connection refused" Apr 21 15:58:26.594459 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:26.594425 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" Apr 21 15:58:26.891332 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:26.891235 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" podUID="f5a5efab-351c-4290-8928-a3397d2f90f1" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused" Apr 21 15:58:36.579021 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:36.578971 2569 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" podUID="96160468-5588-44bd-9c1f-cd17ce9cef6a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8001/health\": dial tcp 10.133.0.53:8001: connect: connection refused" Apr 21 15:58:36.902023 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:36.901945 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" Apr 21 15:58:36.910749 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:36.910719 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" Apr 21 15:58:38.246222 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:38.246191 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" Apr 21 15:58:46.578632 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:46.578565 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" podUID="96160468-5588-44bd-9c1f-cd17ce9cef6a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8001/health\": dial tcp 10.133.0.53:8001: connect: connection refused" Apr 21 15:58:56.579087 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:56.579033 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" podUID="96160468-5588-44bd-9c1f-cd17ce9cef6a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8001/health\": dial tcp 10.133.0.53:8001: connect: connection refused" Apr 21 15:58:59.742930 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:59.742894 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49"] Apr 21 
15:58:59.743405 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:58:59.743268 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" podUID="f5a5efab-351c-4290-8928-a3397d2f90f1" containerName="main" containerID="cri-o://2776361ee08df7906a03388e6f5f64b59b2a98bf8b973559ed18526cdce0de5f" gracePeriod=30 Apr 21 15:59:06.579106 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:06.579061 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" podUID="96160468-5588-44bd-9c1f-cd17ce9cef6a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8001/health\": dial tcp 10.133.0.53:8001: connect: connection refused" Apr 21 15:59:16.579407 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:16.579351 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" podUID="96160468-5588-44bd-9c1f-cd17ce9cef6a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8001/health\": dial tcp 10.133.0.53:8001: connect: connection refused" Apr 21 15:59:26.578778 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:26.578725 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" podUID="96160468-5588-44bd-9c1f-cd17ce9cef6a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8001/health\": dial tcp 10.133.0.53:8001: connect: connection refused" Apr 21 15:59:30.006238 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:30.006172 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-db6d695bf-x4s49_f5a5efab-351c-4290-8928-a3397d2f90f1/main/0.log" Apr 21 15:59:30.006687 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:30.006656 2569 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" Apr 21 15:59:30.107145 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:30.107110 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f5a5efab-351c-4290-8928-a3397d2f90f1-tls-certs\") pod \"f5a5efab-351c-4290-8928-a3397d2f90f1\" (UID: \"f5a5efab-351c-4290-8928-a3397d2f90f1\") " Apr 21 15:59:30.107145 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:30.107148 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f5a5efab-351c-4290-8928-a3397d2f90f1-home\") pod \"f5a5efab-351c-4290-8928-a3397d2f90f1\" (UID: \"f5a5efab-351c-4290-8928-a3397d2f90f1\") " Apr 21 15:59:30.107422 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:30.107197 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f5a5efab-351c-4290-8928-a3397d2f90f1-model-cache\") pod \"f5a5efab-351c-4290-8928-a3397d2f90f1\" (UID: \"f5a5efab-351c-4290-8928-a3397d2f90f1\") " Apr 21 15:59:30.107422 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:30.107267 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f5a5efab-351c-4290-8928-a3397d2f90f1-dshm\") pod \"f5a5efab-351c-4290-8928-a3397d2f90f1\" (UID: \"f5a5efab-351c-4290-8928-a3397d2f90f1\") " Apr 21 15:59:30.107422 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:30.107313 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f5a5efab-351c-4290-8928-a3397d2f90f1-kserve-provision-location\") pod \"f5a5efab-351c-4290-8928-a3397d2f90f1\" (UID: \"f5a5efab-351c-4290-8928-a3397d2f90f1\") " Apr 21 15:59:30.107422 ip-10-0-128-232 kubenswrapper[2569]: I0421 
15:59:30.107334 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6m98z\" (UniqueName: \"kubernetes.io/projected/f5a5efab-351c-4290-8928-a3397d2f90f1-kube-api-access-6m98z\") pod \"f5a5efab-351c-4290-8928-a3397d2f90f1\" (UID: \"f5a5efab-351c-4290-8928-a3397d2f90f1\") " Apr 21 15:59:30.107677 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:30.107585 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5a5efab-351c-4290-8928-a3397d2f90f1-model-cache" (OuterVolumeSpecName: "model-cache") pod "f5a5efab-351c-4290-8928-a3397d2f90f1" (UID: "f5a5efab-351c-4290-8928-a3397d2f90f1"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:59:30.107677 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:30.107653 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5a5efab-351c-4290-8928-a3397d2f90f1-home" (OuterVolumeSpecName: "home") pod "f5a5efab-351c-4290-8928-a3397d2f90f1" (UID: "f5a5efab-351c-4290-8928-a3397d2f90f1"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:59:30.109448 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:30.109423 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5a5efab-351c-4290-8928-a3397d2f90f1-dshm" (OuterVolumeSpecName: "dshm") pod "f5a5efab-351c-4290-8928-a3397d2f90f1" (UID: "f5a5efab-351c-4290-8928-a3397d2f90f1"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:59:30.109907 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:30.109879 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5a5efab-351c-4290-8928-a3397d2f90f1-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f5a5efab-351c-4290-8928-a3397d2f90f1" (UID: "f5a5efab-351c-4290-8928-a3397d2f90f1"). 
InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:59:30.109991 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:30.109966 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5a5efab-351c-4290-8928-a3397d2f90f1-kube-api-access-6m98z" (OuterVolumeSpecName: "kube-api-access-6m98z") pod "f5a5efab-351c-4290-8928-a3397d2f90f1" (UID: "f5a5efab-351c-4290-8928-a3397d2f90f1"). InnerVolumeSpecName "kube-api-access-6m98z". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:59:30.161536 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:30.161488 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5a5efab-351c-4290-8928-a3397d2f90f1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f5a5efab-351c-4290-8928-a3397d2f90f1" (UID: "f5a5efab-351c-4290-8928-a3397d2f90f1"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:59:30.208608 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:30.208574 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f5a5efab-351c-4290-8928-a3397d2f90f1-kserve-provision-location\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:59:30.208608 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:30.208602 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6m98z\" (UniqueName: \"kubernetes.io/projected/f5a5efab-351c-4290-8928-a3397d2f90f1-kube-api-access-6m98z\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:59:30.208608 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:30.208613 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f5a5efab-351c-4290-8928-a3397d2f90f1-tls-certs\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:59:30.208833 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:30.208622 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f5a5efab-351c-4290-8928-a3397d2f90f1-home\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:59:30.208833 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:30.208632 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f5a5efab-351c-4290-8928-a3397d2f90f1-model-cache\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:59:30.208833 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:30.208642 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f5a5efab-351c-4290-8928-a3397d2f90f1-dshm\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 15:59:30.555373 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:30.555349 2569 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-db6d695bf-x4s49_f5a5efab-351c-4290-8928-a3397d2f90f1/main/0.log" Apr 21 15:59:30.555766 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:30.555742 2569 generic.go:358] "Generic (PLEG): container finished" podID="f5a5efab-351c-4290-8928-a3397d2f90f1" containerID="2776361ee08df7906a03388e6f5f64b59b2a98bf8b973559ed18526cdce0de5f" exitCode=137 Apr 21 15:59:30.555838 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:30.555814 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" Apr 21 15:59:30.555898 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:30.555874 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" event={"ID":"f5a5efab-351c-4290-8928-a3397d2f90f1","Type":"ContainerDied","Data":"2776361ee08df7906a03388e6f5f64b59b2a98bf8b973559ed18526cdce0de5f"} Apr 21 15:59:30.555951 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:30.555921 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49" event={"ID":"f5a5efab-351c-4290-8928-a3397d2f90f1","Type":"ContainerDied","Data":"a8c51e4b563e415c8e13a4cbe8bef96a56d90a46432056dca90e6e21b1f4625d"} Apr 21 15:59:30.555951 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:30.555941 2569 scope.go:117] "RemoveContainer" containerID="2776361ee08df7906a03388e6f5f64b59b2a98bf8b973559ed18526cdce0de5f" Apr 21 15:59:30.575798 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:30.575776 2569 scope.go:117] "RemoveContainer" containerID="83d0d62201c2e8617e99e03a0f1c2f422968969f89ac85c5515962f7fafba086" Apr 21 15:59:30.587081 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:30.587060 2569 scope.go:117] "RemoveContainer" containerID="2776361ee08df7906a03388e6f5f64b59b2a98bf8b973559ed18526cdce0de5f" Apr 21 
15:59:30.587406 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:59:30.587381 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2776361ee08df7906a03388e6f5f64b59b2a98bf8b973559ed18526cdce0de5f\": container with ID starting with 2776361ee08df7906a03388e6f5f64b59b2a98bf8b973559ed18526cdce0de5f not found: ID does not exist" containerID="2776361ee08df7906a03388e6f5f64b59b2a98bf8b973559ed18526cdce0de5f" Apr 21 15:59:30.587537 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:30.587415 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2776361ee08df7906a03388e6f5f64b59b2a98bf8b973559ed18526cdce0de5f"} err="failed to get container status \"2776361ee08df7906a03388e6f5f64b59b2a98bf8b973559ed18526cdce0de5f\": rpc error: code = NotFound desc = could not find container \"2776361ee08df7906a03388e6f5f64b59b2a98bf8b973559ed18526cdce0de5f\": container with ID starting with 2776361ee08df7906a03388e6f5f64b59b2a98bf8b973559ed18526cdce0de5f not found: ID does not exist" Apr 21 15:59:30.587537 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:30.587435 2569 scope.go:117] "RemoveContainer" containerID="83d0d62201c2e8617e99e03a0f1c2f422968969f89ac85c5515962f7fafba086" Apr 21 15:59:30.587742 ip-10-0-128-232 kubenswrapper[2569]: E0421 15:59:30.587724 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83d0d62201c2e8617e99e03a0f1c2f422968969f89ac85c5515962f7fafba086\": container with ID starting with 83d0d62201c2e8617e99e03a0f1c2f422968969f89ac85c5515962f7fafba086 not found: ID does not exist" containerID="83d0d62201c2e8617e99e03a0f1c2f422968969f89ac85c5515962f7fafba086" Apr 21 15:59:30.587801 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:30.587752 2569 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"83d0d62201c2e8617e99e03a0f1c2f422968969f89ac85c5515962f7fafba086"} err="failed to get container status \"83d0d62201c2e8617e99e03a0f1c2f422968969f89ac85c5515962f7fafba086\": rpc error: code = NotFound desc = could not find container \"83d0d62201c2e8617e99e03a0f1c2f422968969f89ac85c5515962f7fafba086\": container with ID starting with 83d0d62201c2e8617e99e03a0f1c2f422968969f89ac85c5515962f7fafba086 not found: ID does not exist" Apr 21 15:59:30.588656 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:30.588634 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49"] Apr 21 15:59:30.598614 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:30.598586 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-db6d695bf-x4s49"] Apr 21 15:59:31.054591 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:31.054552 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5a5efab-351c-4290-8928-a3397d2f90f1" path="/var/lib/kubelet/pods/f5a5efab-351c-4290-8928-a3397d2f90f1/volumes" Apr 21 15:59:36.578947 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:36.578893 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" podUID="96160468-5588-44bd-9c1f-cd17ce9cef6a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8001/health\": dial tcp 10.133.0.53:8001: connect: connection refused" Apr 21 15:59:46.578761 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:46.578713 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" podUID="96160468-5588-44bd-9c1f-cd17ce9cef6a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8001/health\": dial tcp 10.133.0.53:8001: connect: connection refused" Apr 21 15:59:56.587947 ip-10-0-128-232 
kubenswrapper[2569]: I0421 15:59:56.587913 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" Apr 21 15:59:56.600411 ip-10-0-128-232 kubenswrapper[2569]: I0421 15:59:56.600388 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" Apr 21 16:00:08.820451 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:08.820410 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6"] Apr 21 16:00:08.821012 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:08.820827 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" podUID="07f63690-2746-48eb-ba3b-51b901a91a4d" containerName="main" containerID="cri-o://9be68a52c77544e3ec67afab798abbda1f4db3763aaa2567988d4d041864a3b7" gracePeriod=30 Apr 21 16:00:08.821012 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:08.820876 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" podUID="07f63690-2746-48eb-ba3b-51b901a91a4d" containerName="tokenizer" containerID="cri-o://be191e6e7bb8357cd80698951a3a0694ea427ccac4f201d0f972e127dcd787b6" gracePeriod=30 Apr 21 16:00:08.832873 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:08.832841 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg"] Apr 21 16:00:08.833201 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:08.833175 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" podUID="96160468-5588-44bd-9c1f-cd17ce9cef6a" containerName="main" 
containerID="cri-o://57b3fa7820e98cf07eecfa55033f2cb036317ef07f8c4240ad4c240def3d1c80" gracePeriod=30 Apr 21 16:00:09.715206 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:09.715168 2569 generic.go:358] "Generic (PLEG): container finished" podID="07f63690-2746-48eb-ba3b-51b901a91a4d" containerID="9be68a52c77544e3ec67afab798abbda1f4db3763aaa2567988d4d041864a3b7" exitCode=0 Apr 21 16:00:09.715381 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:09.715231 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" event={"ID":"07f63690-2746-48eb-ba3b-51b901a91a4d","Type":"ContainerDied","Data":"9be68a52c77544e3ec67afab798abbda1f4db3763aaa2567988d4d041864a3b7"} Apr 21 16:00:10.177293 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.177269 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" Apr 21 16:00:10.262455 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.262421 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/07f63690-2746-48eb-ba3b-51b901a91a4d-tls-certs\") pod \"07f63690-2746-48eb-ba3b-51b901a91a4d\" (UID: \"07f63690-2746-48eb-ba3b-51b901a91a4d\") " Apr 21 16:00:10.262455 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.262459 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/07f63690-2746-48eb-ba3b-51b901a91a4d-tokenizer-cache\") pod \"07f63690-2746-48eb-ba3b-51b901a91a4d\" (UID: \"07f63690-2746-48eb-ba3b-51b901a91a4d\") " Apr 21 16:00:10.262686 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.262530 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/07f63690-2746-48eb-ba3b-51b901a91a4d-kserve-provision-location\") pod \"07f63690-2746-48eb-ba3b-51b901a91a4d\" (UID: \"07f63690-2746-48eb-ba3b-51b901a91a4d\") " Apr 21 16:00:10.262686 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.262554 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/07f63690-2746-48eb-ba3b-51b901a91a4d-tokenizer-tmp\") pod \"07f63690-2746-48eb-ba3b-51b901a91a4d\" (UID: \"07f63690-2746-48eb-ba3b-51b901a91a4d\") " Apr 21 16:00:10.262686 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.262590 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/07f63690-2746-48eb-ba3b-51b901a91a4d-tokenizer-uds\") pod \"07f63690-2746-48eb-ba3b-51b901a91a4d\" (UID: \"07f63690-2746-48eb-ba3b-51b901a91a4d\") " Apr 21 16:00:10.262686 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.262665 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gp9bw\" (UniqueName: \"kubernetes.io/projected/07f63690-2746-48eb-ba3b-51b901a91a4d-kube-api-access-gp9bw\") pod \"07f63690-2746-48eb-ba3b-51b901a91a4d\" (UID: \"07f63690-2746-48eb-ba3b-51b901a91a4d\") " Apr 21 16:00:10.262867 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.262812 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07f63690-2746-48eb-ba3b-51b901a91a4d-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "07f63690-2746-48eb-ba3b-51b901a91a4d" (UID: "07f63690-2746-48eb-ba3b-51b901a91a4d"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:00:10.262925 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.262883 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07f63690-2746-48eb-ba3b-51b901a91a4d-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "07f63690-2746-48eb-ba3b-51b901a91a4d" (UID: "07f63690-2746-48eb-ba3b-51b901a91a4d"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:00:10.262925 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.262901 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/07f63690-2746-48eb-ba3b-51b901a91a4d-tokenizer-cache\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:00:10.262925 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.262905 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07f63690-2746-48eb-ba3b-51b901a91a4d-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "07f63690-2746-48eb-ba3b-51b901a91a4d" (UID: "07f63690-2746-48eb-ba3b-51b901a91a4d"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:00:10.263172 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.263152 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07f63690-2746-48eb-ba3b-51b901a91a4d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "07f63690-2746-48eb-ba3b-51b901a91a4d" (UID: "07f63690-2746-48eb-ba3b-51b901a91a4d"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:00:10.264645 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.264614 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07f63690-2746-48eb-ba3b-51b901a91a4d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "07f63690-2746-48eb-ba3b-51b901a91a4d" (UID: "07f63690-2746-48eb-ba3b-51b901a91a4d"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:00:10.264808 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.264791 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07f63690-2746-48eb-ba3b-51b901a91a4d-kube-api-access-gp9bw" (OuterVolumeSpecName: "kube-api-access-gp9bw") pod "07f63690-2746-48eb-ba3b-51b901a91a4d" (UID: "07f63690-2746-48eb-ba3b-51b901a91a4d"). InnerVolumeSpecName "kube-api-access-gp9bw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:00:10.363714 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.363617 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/07f63690-2746-48eb-ba3b-51b901a91a4d-tls-certs\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:00:10.363714 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.363657 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07f63690-2746-48eb-ba3b-51b901a91a4d-kserve-provision-location\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:00:10.363714 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.363669 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/07f63690-2746-48eb-ba3b-51b901a91a4d-tokenizer-tmp\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:00:10.363714 ip-10-0-128-232 kubenswrapper[2569]: I0421 
16:00:10.363678 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/07f63690-2746-48eb-ba3b-51b901a91a4d-tokenizer-uds\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:00:10.363714 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.363687 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gp9bw\" (UniqueName: \"kubernetes.io/projected/07f63690-2746-48eb-ba3b-51b901a91a4d-kube-api-access-gp9bw\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:00:10.720546 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.720506 2569 generic.go:358] "Generic (PLEG): container finished" podID="07f63690-2746-48eb-ba3b-51b901a91a4d" containerID="be191e6e7bb8357cd80698951a3a0694ea427ccac4f201d0f972e127dcd787b6" exitCode=0 Apr 21 16:00:10.720713 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.720604 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" event={"ID":"07f63690-2746-48eb-ba3b-51b901a91a4d","Type":"ContainerDied","Data":"be191e6e7bb8357cd80698951a3a0694ea427ccac4f201d0f972e127dcd787b6"} Apr 21 16:00:10.720713 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.720630 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" event={"ID":"07f63690-2746-48eb-ba3b-51b901a91a4d","Type":"ContainerDied","Data":"f710390225d8745ac844e5bc1f23b6db2c5a265cd21aa9d4068b26918c8055a8"} Apr 21 16:00:10.720713 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.720646 2569 scope.go:117] "RemoveContainer" containerID="be191e6e7bb8357cd80698951a3a0694ea427ccac4f201d0f972e127dcd787b6" Apr 21 16:00:10.720713 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.720661 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6" Apr 21 16:00:10.729727 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.729709 2569 scope.go:117] "RemoveContainer" containerID="9be68a52c77544e3ec67afab798abbda1f4db3763aaa2567988d4d041864a3b7" Apr 21 16:00:10.737391 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.737369 2569 scope.go:117] "RemoveContainer" containerID="aed4966e8bb1e31db05aadbc05045da8f61449d7d6f1a5431f0e18b08d615aed" Apr 21 16:00:10.744101 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.744022 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6"] Apr 21 16:00:10.746469 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.746450 2569 scope.go:117] "RemoveContainer" containerID="be191e6e7bb8357cd80698951a3a0694ea427ccac4f201d0f972e127dcd787b6" Apr 21 16:00:10.746570 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.746549 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-599496697hpwl6"] Apr 21 16:00:10.746763 ip-10-0-128-232 kubenswrapper[2569]: E0421 16:00:10.746745 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be191e6e7bb8357cd80698951a3a0694ea427ccac4f201d0f972e127dcd787b6\": container with ID starting with be191e6e7bb8357cd80698951a3a0694ea427ccac4f201d0f972e127dcd787b6 not found: ID does not exist" containerID="be191e6e7bb8357cd80698951a3a0694ea427ccac4f201d0f972e127dcd787b6" Apr 21 16:00:10.746806 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.746774 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be191e6e7bb8357cd80698951a3a0694ea427ccac4f201d0f972e127dcd787b6"} err="failed to get container status 
\"be191e6e7bb8357cd80698951a3a0694ea427ccac4f201d0f972e127dcd787b6\": rpc error: code = NotFound desc = could not find container \"be191e6e7bb8357cd80698951a3a0694ea427ccac4f201d0f972e127dcd787b6\": container with ID starting with be191e6e7bb8357cd80698951a3a0694ea427ccac4f201d0f972e127dcd787b6 not found: ID does not exist" Apr 21 16:00:10.746806 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.746793 2569 scope.go:117] "RemoveContainer" containerID="9be68a52c77544e3ec67afab798abbda1f4db3763aaa2567988d4d041864a3b7" Apr 21 16:00:10.747052 ip-10-0-128-232 kubenswrapper[2569]: E0421 16:00:10.747033 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9be68a52c77544e3ec67afab798abbda1f4db3763aaa2567988d4d041864a3b7\": container with ID starting with 9be68a52c77544e3ec67afab798abbda1f4db3763aaa2567988d4d041864a3b7 not found: ID does not exist" containerID="9be68a52c77544e3ec67afab798abbda1f4db3763aaa2567988d4d041864a3b7" Apr 21 16:00:10.747128 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.747055 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9be68a52c77544e3ec67afab798abbda1f4db3763aaa2567988d4d041864a3b7"} err="failed to get container status \"9be68a52c77544e3ec67afab798abbda1f4db3763aaa2567988d4d041864a3b7\": rpc error: code = NotFound desc = could not find container \"9be68a52c77544e3ec67afab798abbda1f4db3763aaa2567988d4d041864a3b7\": container with ID starting with 9be68a52c77544e3ec67afab798abbda1f4db3763aaa2567988d4d041864a3b7 not found: ID does not exist" Apr 21 16:00:10.747128 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.747067 2569 scope.go:117] "RemoveContainer" containerID="aed4966e8bb1e31db05aadbc05045da8f61449d7d6f1a5431f0e18b08d615aed" Apr 21 16:00:10.747310 ip-10-0-128-232 kubenswrapper[2569]: E0421 16:00:10.747293 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"aed4966e8bb1e31db05aadbc05045da8f61449d7d6f1a5431f0e18b08d615aed\": container with ID starting with aed4966e8bb1e31db05aadbc05045da8f61449d7d6f1a5431f0e18b08d615aed not found: ID does not exist" containerID="aed4966e8bb1e31db05aadbc05045da8f61449d7d6f1a5431f0e18b08d615aed" Apr 21 16:00:10.747352 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:10.747315 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aed4966e8bb1e31db05aadbc05045da8f61449d7d6f1a5431f0e18b08d615aed"} err="failed to get container status \"aed4966e8bb1e31db05aadbc05045da8f61449d7d6f1a5431f0e18b08d615aed\": rpc error: code = NotFound desc = could not find container \"aed4966e8bb1e31db05aadbc05045da8f61449d7d6f1a5431f0e18b08d615aed\": container with ID starting with aed4966e8bb1e31db05aadbc05045da8f61449d7d6f1a5431f0e18b08d615aed not found: ID does not exist" Apr 21 16:00:11.055364 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:11.055276 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07f63690-2746-48eb-ba3b-51b901a91a4d" path="/var/lib/kubelet/pods/07f63690-2746-48eb-ba3b-51b901a91a4d/volumes" Apr 21 16:00:11.924920 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:11.924883 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx"] Apr 21 16:00:11.925367 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:11.925217 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f5a5efab-351c-4290-8928-a3397d2f90f1" containerName="storage-initializer" Apr 21 16:00:11.925367 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:11.925231 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5a5efab-351c-4290-8928-a3397d2f90f1" containerName="storage-initializer" Apr 21 16:00:11.925367 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:11.925250 2569 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="07f63690-2746-48eb-ba3b-51b901a91a4d" containerName="tokenizer" Apr 21 16:00:11.925367 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:11.925257 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="07f63690-2746-48eb-ba3b-51b901a91a4d" containerName="tokenizer" Apr 21 16:00:11.925367 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:11.925264 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f5a5efab-351c-4290-8928-a3397d2f90f1" containerName="main" Apr 21 16:00:11.925367 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:11.925270 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5a5efab-351c-4290-8928-a3397d2f90f1" containerName="main" Apr 21 16:00:11.925367 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:11.925276 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07f63690-2746-48eb-ba3b-51b901a91a4d" containerName="main" Apr 21 16:00:11.925367 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:11.925280 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="07f63690-2746-48eb-ba3b-51b901a91a4d" containerName="main" Apr 21 16:00:11.925367 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:11.925296 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07f63690-2746-48eb-ba3b-51b901a91a4d" containerName="storage-initializer" Apr 21 16:00:11.925367 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:11.925302 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="07f63690-2746-48eb-ba3b-51b901a91a4d" containerName="storage-initializer" Apr 21 16:00:11.925367 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:11.925365 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="07f63690-2746-48eb-ba3b-51b901a91a4d" containerName="main" Apr 21 16:00:11.925367 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:11.925374 2569 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="f5a5efab-351c-4290-8928-a3397d2f90f1" containerName="main" Apr 21 16:00:11.925993 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:11.925384 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="07f63690-2746-48eb-ba3b-51b901a91a4d" containerName="tokenizer" Apr 21 16:00:11.930051 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:11.930029 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" Apr 21 16:00:11.932627 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:11.932595 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8de1d74aab16d9cabd8b5aafeb5248e8-kserve-self-signed-certs\"" Apr 21 16:00:11.940790 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:11.940766 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx"] Apr 21 16:00:11.979683 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:11.979652 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6cb3c30c-55b8-4b3e-9c1c-32d913925335-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx\" (UID: \"6cb3c30c-55b8-4b3e-9c1c-32d913925335\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" Apr 21 16:00:11.979683 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:11.979695 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m6kq\" (UniqueName: \"kubernetes.io/projected/6cb3c30c-55b8-4b3e-9c1c-32d913925335-kube-api-access-2m6kq\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx\" (UID: \"6cb3c30c-55b8-4b3e-9c1c-32d913925335\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" Apr 21 16:00:11.979892 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:11.979730 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6cb3c30c-55b8-4b3e-9c1c-32d913925335-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx\" (UID: \"6cb3c30c-55b8-4b3e-9c1c-32d913925335\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" Apr 21 16:00:11.979892 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:11.979755 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6cb3c30c-55b8-4b3e-9c1c-32d913925335-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx\" (UID: \"6cb3c30c-55b8-4b3e-9c1c-32d913925335\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" Apr 21 16:00:11.979892 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:11.979804 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6cb3c30c-55b8-4b3e-9c1c-32d913925335-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx\" (UID: \"6cb3c30c-55b8-4b3e-9c1c-32d913925335\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" Apr 21 16:00:11.979892 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:11.979823 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6cb3c30c-55b8-4b3e-9c1c-32d913925335-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx\" (UID: \"6cb3c30c-55b8-4b3e-9c1c-32d913925335\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" Apr 21 16:00:12.081160 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:12.081117 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2m6kq\" (UniqueName: \"kubernetes.io/projected/6cb3c30c-55b8-4b3e-9c1c-32d913925335-kube-api-access-2m6kq\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx\" (UID: \"6cb3c30c-55b8-4b3e-9c1c-32d913925335\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" Apr 21 16:00:12.081316 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:12.081171 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6cb3c30c-55b8-4b3e-9c1c-32d913925335-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx\" (UID: \"6cb3c30c-55b8-4b3e-9c1c-32d913925335\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" Apr 21 16:00:12.081316 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:12.081287 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6cb3c30c-55b8-4b3e-9c1c-32d913925335-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx\" (UID: \"6cb3c30c-55b8-4b3e-9c1c-32d913925335\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" Apr 21 16:00:12.081435 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:12.081360 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6cb3c30c-55b8-4b3e-9c1c-32d913925335-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx\" (UID: \"6cb3c30c-55b8-4b3e-9c1c-32d913925335\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" Apr 21 16:00:12.081435 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:12.081396 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6cb3c30c-55b8-4b3e-9c1c-32d913925335-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx\" (UID: \"6cb3c30c-55b8-4b3e-9c1c-32d913925335\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" Apr 21 16:00:12.081435 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:12.081429 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6cb3c30c-55b8-4b3e-9c1c-32d913925335-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx\" (UID: \"6cb3c30c-55b8-4b3e-9c1c-32d913925335\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" Apr 21 16:00:12.081724 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:12.081540 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6cb3c30c-55b8-4b3e-9c1c-32d913925335-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx\" (UID: \"6cb3c30c-55b8-4b3e-9c1c-32d913925335\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" Apr 21 16:00:12.081724 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:12.081643 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6cb3c30c-55b8-4b3e-9c1c-32d913925335-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx\" (UID: \"6cb3c30c-55b8-4b3e-9c1c-32d913925335\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" Apr 21 16:00:12.081822 
ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:12.081756 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6cb3c30c-55b8-4b3e-9c1c-32d913925335-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx\" (UID: \"6cb3c30c-55b8-4b3e-9c1c-32d913925335\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" Apr 21 16:00:12.083848 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:12.083823 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6cb3c30c-55b8-4b3e-9c1c-32d913925335-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx\" (UID: \"6cb3c30c-55b8-4b3e-9c1c-32d913925335\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" Apr 21 16:00:12.084045 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:12.084029 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6cb3c30c-55b8-4b3e-9c1c-32d913925335-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx\" (UID: \"6cb3c30c-55b8-4b3e-9c1c-32d913925335\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" Apr 21 16:00:12.091064 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:12.091041 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m6kq\" (UniqueName: \"kubernetes.io/projected/6cb3c30c-55b8-4b3e-9c1c-32d913925335-kube-api-access-2m6kq\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx\" (UID: \"6cb3c30c-55b8-4b3e-9c1c-32d913925335\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" Apr 21 16:00:12.241188 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:12.241103 2569 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" Apr 21 16:00:12.381913 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:12.381876 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx"] Apr 21 16:00:12.383831 ip-10-0-128-232 kubenswrapper[2569]: W0421 16:00:12.383805 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cb3c30c_55b8_4b3e_9c1c_32d913925335.slice/crio-806fab78aaaeb2f3212a25a7e72f082aa52ae0bc393b3e9a2efa86ffd8552003 WatchSource:0}: Error finding container 806fab78aaaeb2f3212a25a7e72f082aa52ae0bc393b3e9a2efa86ffd8552003: Status 404 returned error can't find the container with id 806fab78aaaeb2f3212a25a7e72f082aa52ae0bc393b3e9a2efa86ffd8552003 Apr 21 16:00:12.734150 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:12.734112 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" event={"ID":"6cb3c30c-55b8-4b3e-9c1c-32d913925335","Type":"ContainerStarted","Data":"c1d1115a39484f62cd8179f4a209c2ec2dd13071faf3da38b1c4ed185a7c3024"} Apr 21 16:00:12.734150 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:12.734157 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" event={"ID":"6cb3c30c-55b8-4b3e-9c1c-32d913925335","Type":"ContainerStarted","Data":"806fab78aaaeb2f3212a25a7e72f082aa52ae0bc393b3e9a2efa86ffd8552003"} Apr 21 16:00:16.751688 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:16.751648 2569 generic.go:358] "Generic (PLEG): container finished" podID="6cb3c30c-55b8-4b3e-9c1c-32d913925335" containerID="c1d1115a39484f62cd8179f4a209c2ec2dd13071faf3da38b1c4ed185a7c3024" exitCode=0 Apr 21 16:00:16.752045 ip-10-0-128-232 
kubenswrapper[2569]: I0421 16:00:16.751720 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" event={"ID":"6cb3c30c-55b8-4b3e-9c1c-32d913925335","Type":"ContainerDied","Data":"c1d1115a39484f62cd8179f4a209c2ec2dd13071faf3da38b1c4ed185a7c3024"} Apr 21 16:00:17.761163 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:17.761123 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" event={"ID":"6cb3c30c-55b8-4b3e-9c1c-32d913925335","Type":"ContainerStarted","Data":"eaa1f1066f81336da63d8113855fc8bfc83f3eb706900fc3781671e0294e4c1a"} Apr 21 16:00:17.788783 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:17.788719 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" podStartSLOduration=6.788699826 podStartE2EDuration="6.788699826s" podCreationTimestamp="2026-04-21 16:00:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:00:17.785387911 +0000 UTC m=+1493.357420852" watchObservedRunningTime="2026-04-21 16:00:17.788699826 +0000 UTC m=+1493.360732768" Apr 21 16:00:22.241946 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:22.241906 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" Apr 21 16:00:22.242458 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:22.241959 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" Apr 21 16:00:22.243537 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:22.243475 2569 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" podUID="6cb3c30c-55b8-4b3e-9c1c-32d913925335" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8000/health\": dial tcp 10.133.0.55:8000: connect: connection refused" Apr 21 16:00:25.059197 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:25.059168 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wkxqg_7b0083d8-b152-40fa-9a89-e3180ed1747d/ovn-acl-logging/0.log" Apr 21 16:00:25.060018 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:25.059994 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wkxqg_7b0083d8-b152-40fa-9a89-e3180ed1747d/ovn-acl-logging/0.log" Apr 21 16:00:32.242119 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:32.242072 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" podUID="6cb3c30c-55b8-4b3e-9c1c-32d913925335" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8000/health\": dial tcp 10.133.0.55:8000: connect: connection refused" Apr 21 16:00:38.833474 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:38.833397 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" podUID="96160468-5588-44bd-9c1f-cd17ce9cef6a" containerName="llm-d-routing-sidecar" containerID="cri-o://861dc0e99c7c70c980f9b8e40b6d7f1c4cd8fcbdc0ebd490c409784fb9ee8f70" gracePeriod=2 Apr 21 16:00:39.126940 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.126916 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-796b546897-pn2lg_96160468-5588-44bd-9c1f-cd17ce9cef6a/main/0.log" Apr 21 16:00:39.127665 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.127643 2569 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" Apr 21 16:00:39.231177 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.231139 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96160468-5588-44bd-9c1f-cd17ce9cef6a-kserve-provision-location\") pod \"96160468-5588-44bd-9c1f-cd17ce9cef6a\" (UID: \"96160468-5588-44bd-9c1f-cd17ce9cef6a\") " Apr 21 16:00:39.231374 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.231205 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/96160468-5588-44bd-9c1f-cd17ce9cef6a-tls-certs\") pod \"96160468-5588-44bd-9c1f-cd17ce9cef6a\" (UID: \"96160468-5588-44bd-9c1f-cd17ce9cef6a\") " Apr 21 16:00:39.231374 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.231234 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-998vf\" (UniqueName: \"kubernetes.io/projected/96160468-5588-44bd-9c1f-cd17ce9cef6a-kube-api-access-998vf\") pod \"96160468-5588-44bd-9c1f-cd17ce9cef6a\" (UID: \"96160468-5588-44bd-9c1f-cd17ce9cef6a\") " Apr 21 16:00:39.231374 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.231263 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/96160468-5588-44bd-9c1f-cd17ce9cef6a-home\") pod \"96160468-5588-44bd-9c1f-cd17ce9cef6a\" (UID: \"96160468-5588-44bd-9c1f-cd17ce9cef6a\") " Apr 21 16:00:39.231374 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.231286 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/96160468-5588-44bd-9c1f-cd17ce9cef6a-dshm\") pod \"96160468-5588-44bd-9c1f-cd17ce9cef6a\" (UID: \"96160468-5588-44bd-9c1f-cd17ce9cef6a\") " Apr 21 16:00:39.231374 
ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.231346 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/96160468-5588-44bd-9c1f-cd17ce9cef6a-model-cache\") pod \"96160468-5588-44bd-9c1f-cd17ce9cef6a\" (UID: \"96160468-5588-44bd-9c1f-cd17ce9cef6a\") " Apr 21 16:00:39.231783 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.231732 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96160468-5588-44bd-9c1f-cd17ce9cef6a-model-cache" (OuterVolumeSpecName: "model-cache") pod "96160468-5588-44bd-9c1f-cd17ce9cef6a" (UID: "96160468-5588-44bd-9c1f-cd17ce9cef6a"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:00:39.232049 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.231770 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96160468-5588-44bd-9c1f-cd17ce9cef6a-home" (OuterVolumeSpecName: "home") pod "96160468-5588-44bd-9c1f-cd17ce9cef6a" (UID: "96160468-5588-44bd-9c1f-cd17ce9cef6a"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:00:39.232378 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.232355 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/96160468-5588-44bd-9c1f-cd17ce9cef6a-home\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:00:39.232461 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.232383 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/96160468-5588-44bd-9c1f-cd17ce9cef6a-model-cache\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:00:39.233625 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.233596 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96160468-5588-44bd-9c1f-cd17ce9cef6a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "96160468-5588-44bd-9c1f-cd17ce9cef6a" (UID: "96160468-5588-44bd-9c1f-cd17ce9cef6a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:00:39.234009 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.233988 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96160468-5588-44bd-9c1f-cd17ce9cef6a-dshm" (OuterVolumeSpecName: "dshm") pod "96160468-5588-44bd-9c1f-cd17ce9cef6a" (UID: "96160468-5588-44bd-9c1f-cd17ce9cef6a"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:00:39.234009 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.233992 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96160468-5588-44bd-9c1f-cd17ce9cef6a-kube-api-access-998vf" (OuterVolumeSpecName: "kube-api-access-998vf") pod "96160468-5588-44bd-9c1f-cd17ce9cef6a" (UID: "96160468-5588-44bd-9c1f-cd17ce9cef6a"). InnerVolumeSpecName "kube-api-access-998vf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:00:39.270171 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.270109 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96160468-5588-44bd-9c1f-cd17ce9cef6a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "96160468-5588-44bd-9c1f-cd17ce9cef6a" (UID: "96160468-5588-44bd-9c1f-cd17ce9cef6a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:00:39.333021 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.332982 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/96160468-5588-44bd-9c1f-cd17ce9cef6a-tls-certs\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:00:39.333021 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.333011 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-998vf\" (UniqueName: \"kubernetes.io/projected/96160468-5588-44bd-9c1f-cd17ce9cef6a-kube-api-access-998vf\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:00:39.333021 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.333023 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/96160468-5588-44bd-9c1f-cd17ce9cef6a-dshm\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:00:39.333259 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.333033 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96160468-5588-44bd-9c1f-cd17ce9cef6a-kserve-provision-location\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:00:39.849734 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.849698 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-796b546897-pn2lg_96160468-5588-44bd-9c1f-cd17ce9cef6a/main/0.log" Apr 21 16:00:39.850376 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.850346 2569 generic.go:358] "Generic (PLEG): container finished" podID="96160468-5588-44bd-9c1f-cd17ce9cef6a" containerID="57b3fa7820e98cf07eecfa55033f2cb036317ef07f8c4240ad4c240def3d1c80" exitCode=137 Apr 21 16:00:39.850493 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.850378 2569 generic.go:358] "Generic (PLEG): container finished" podID="96160468-5588-44bd-9c1f-cd17ce9cef6a" containerID="861dc0e99c7c70c980f9b8e40b6d7f1c4cd8fcbdc0ebd490c409784fb9ee8f70" exitCode=0 Apr 21 16:00:39.850493 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.850417 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" Apr 21 16:00:39.850493 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.850425 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" event={"ID":"96160468-5588-44bd-9c1f-cd17ce9cef6a","Type":"ContainerDied","Data":"57b3fa7820e98cf07eecfa55033f2cb036317ef07f8c4240ad4c240def3d1c80"} Apr 21 16:00:39.850493 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.850464 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" event={"ID":"96160468-5588-44bd-9c1f-cd17ce9cef6a","Type":"ContainerDied","Data":"861dc0e99c7c70c980f9b8e40b6d7f1c4cd8fcbdc0ebd490c409784fb9ee8f70"} Apr 21 16:00:39.850714 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.850496 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg" 
event={"ID":"96160468-5588-44bd-9c1f-cd17ce9cef6a","Type":"ContainerDied","Data":"71a7e311f54818c6eac1be0caa9c428bc0734b43093efbcb76d598ff815ae143"} Apr 21 16:00:39.850714 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.850512 2569 scope.go:117] "RemoveContainer" containerID="57b3fa7820e98cf07eecfa55033f2cb036317ef07f8c4240ad4c240def3d1c80" Apr 21 16:00:39.873646 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.873503 2569 scope.go:117] "RemoveContainer" containerID="3f557f3028387b6101e12019d7cfc28818177c4d280b7b6e15ead074553ed64f" Apr 21 16:00:39.876123 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.876091 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg"] Apr 21 16:00:39.879016 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.878994 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-796b546897-pn2lg"] Apr 21 16:00:39.918822 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.918794 2569 scope.go:117] "RemoveContainer" containerID="861dc0e99c7c70c980f9b8e40b6d7f1c4cd8fcbdc0ebd490c409784fb9ee8f70" Apr 21 16:00:39.930729 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.930696 2569 scope.go:117] "RemoveContainer" containerID="57b3fa7820e98cf07eecfa55033f2cb036317ef07f8c4240ad4c240def3d1c80" Apr 21 16:00:39.931259 ip-10-0-128-232 kubenswrapper[2569]: E0421 16:00:39.931236 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57b3fa7820e98cf07eecfa55033f2cb036317ef07f8c4240ad4c240def3d1c80\": container with ID starting with 57b3fa7820e98cf07eecfa55033f2cb036317ef07f8c4240ad4c240def3d1c80 not found: ID does not exist" containerID="57b3fa7820e98cf07eecfa55033f2cb036317ef07f8c4240ad4c240def3d1c80" Apr 21 16:00:39.931358 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.931268 2569 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"57b3fa7820e98cf07eecfa55033f2cb036317ef07f8c4240ad4c240def3d1c80"} err="failed to get container status \"57b3fa7820e98cf07eecfa55033f2cb036317ef07f8c4240ad4c240def3d1c80\": rpc error: code = NotFound desc = could not find container \"57b3fa7820e98cf07eecfa55033f2cb036317ef07f8c4240ad4c240def3d1c80\": container with ID starting with 57b3fa7820e98cf07eecfa55033f2cb036317ef07f8c4240ad4c240def3d1c80 not found: ID does not exist" Apr 21 16:00:39.931358 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.931295 2569 scope.go:117] "RemoveContainer" containerID="3f557f3028387b6101e12019d7cfc28818177c4d280b7b6e15ead074553ed64f" Apr 21 16:00:39.931740 ip-10-0-128-232 kubenswrapper[2569]: E0421 16:00:39.931713 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f557f3028387b6101e12019d7cfc28818177c4d280b7b6e15ead074553ed64f\": container with ID starting with 3f557f3028387b6101e12019d7cfc28818177c4d280b7b6e15ead074553ed64f not found: ID does not exist" containerID="3f557f3028387b6101e12019d7cfc28818177c4d280b7b6e15ead074553ed64f" Apr 21 16:00:39.931859 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.931749 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f557f3028387b6101e12019d7cfc28818177c4d280b7b6e15ead074553ed64f"} err="failed to get container status \"3f557f3028387b6101e12019d7cfc28818177c4d280b7b6e15ead074553ed64f\": rpc error: code = NotFound desc = could not find container \"3f557f3028387b6101e12019d7cfc28818177c4d280b7b6e15ead074553ed64f\": container with ID starting with 3f557f3028387b6101e12019d7cfc28818177c4d280b7b6e15ead074553ed64f not found: ID does not exist" Apr 21 16:00:39.931859 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.931773 2569 scope.go:117] "RemoveContainer" containerID="861dc0e99c7c70c980f9b8e40b6d7f1c4cd8fcbdc0ebd490c409784fb9ee8f70" Apr 21 16:00:39.932063 ip-10-0-128-232 
kubenswrapper[2569]: E0421 16:00:39.932043 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"861dc0e99c7c70c980f9b8e40b6d7f1c4cd8fcbdc0ebd490c409784fb9ee8f70\": container with ID starting with 861dc0e99c7c70c980f9b8e40b6d7f1c4cd8fcbdc0ebd490c409784fb9ee8f70 not found: ID does not exist" containerID="861dc0e99c7c70c980f9b8e40b6d7f1c4cd8fcbdc0ebd490c409784fb9ee8f70" Apr 21 16:00:39.932147 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.932068 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"861dc0e99c7c70c980f9b8e40b6d7f1c4cd8fcbdc0ebd490c409784fb9ee8f70"} err="failed to get container status \"861dc0e99c7c70c980f9b8e40b6d7f1c4cd8fcbdc0ebd490c409784fb9ee8f70\": rpc error: code = NotFound desc = could not find container \"861dc0e99c7c70c980f9b8e40b6d7f1c4cd8fcbdc0ebd490c409784fb9ee8f70\": container with ID starting with 861dc0e99c7c70c980f9b8e40b6d7f1c4cd8fcbdc0ebd490c409784fb9ee8f70 not found: ID does not exist" Apr 21 16:00:39.932147 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.932084 2569 scope.go:117] "RemoveContainer" containerID="57b3fa7820e98cf07eecfa55033f2cb036317ef07f8c4240ad4c240def3d1c80" Apr 21 16:00:39.932302 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.932281 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57b3fa7820e98cf07eecfa55033f2cb036317ef07f8c4240ad4c240def3d1c80"} err="failed to get container status \"57b3fa7820e98cf07eecfa55033f2cb036317ef07f8c4240ad4c240def3d1c80\": rpc error: code = NotFound desc = could not find container \"57b3fa7820e98cf07eecfa55033f2cb036317ef07f8c4240ad4c240def3d1c80\": container with ID starting with 57b3fa7820e98cf07eecfa55033f2cb036317ef07f8c4240ad4c240def3d1c80 not found: ID does not exist" Apr 21 16:00:39.932302 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.932300 2569 scope.go:117] "RemoveContainer" 
containerID="3f557f3028387b6101e12019d7cfc28818177c4d280b7b6e15ead074553ed64f" Apr 21 16:00:39.932582 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.932560 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f557f3028387b6101e12019d7cfc28818177c4d280b7b6e15ead074553ed64f"} err="failed to get container status \"3f557f3028387b6101e12019d7cfc28818177c4d280b7b6e15ead074553ed64f\": rpc error: code = NotFound desc = could not find container \"3f557f3028387b6101e12019d7cfc28818177c4d280b7b6e15ead074553ed64f\": container with ID starting with 3f557f3028387b6101e12019d7cfc28818177c4d280b7b6e15ead074553ed64f not found: ID does not exist" Apr 21 16:00:39.932656 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.932594 2569 scope.go:117] "RemoveContainer" containerID="861dc0e99c7c70c980f9b8e40b6d7f1c4cd8fcbdc0ebd490c409784fb9ee8f70" Apr 21 16:00:39.932812 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:39.932793 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"861dc0e99c7c70c980f9b8e40b6d7f1c4cd8fcbdc0ebd490c409784fb9ee8f70"} err="failed to get container status \"861dc0e99c7c70c980f9b8e40b6d7f1c4cd8fcbdc0ebd490c409784fb9ee8f70\": rpc error: code = NotFound desc = could not find container \"861dc0e99c7c70c980f9b8e40b6d7f1c4cd8fcbdc0ebd490c409784fb9ee8f70\": container with ID starting with 861dc0e99c7c70c980f9b8e40b6d7f1c4cd8fcbdc0ebd490c409784fb9ee8f70 not found: ID does not exist" Apr 21 16:00:41.055051 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:41.055008 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96160468-5588-44bd-9c1f-cd17ce9cef6a" path="/var/lib/kubelet/pods/96160468-5588-44bd-9c1f-cd17ce9cef6a/volumes" Apr 21 16:00:42.242143 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:42.242095 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" 
podUID="6cb3c30c-55b8-4b3e-9c1c-32d913925335" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8000/health\": dial tcp 10.133.0.55:8000: connect: connection refused" Apr 21 16:00:52.242633 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:00:52.242535 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" podUID="6cb3c30c-55b8-4b3e-9c1c-32d913925335" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8000/health\": dial tcp 10.133.0.55:8000: connect: connection refused" Apr 21 16:01:02.241569 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:01:02.241517 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" podUID="6cb3c30c-55b8-4b3e-9c1c-32d913925335" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8000/health\": dial tcp 10.133.0.55:8000: connect: connection refused" Apr 21 16:01:12.242531 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:01:12.242455 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" podUID="6cb3c30c-55b8-4b3e-9c1c-32d913925335" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8000/health\": dial tcp 10.133.0.55:8000: connect: connection refused" Apr 21 16:01:22.242209 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:01:22.242153 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" podUID="6cb3c30c-55b8-4b3e-9c1c-32d913925335" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8000/health\": dial tcp 10.133.0.55:8000: connect: connection refused" Apr 21 16:01:32.241753 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:01:32.241700 2569 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" podUID="6cb3c30c-55b8-4b3e-9c1c-32d913925335" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8000/health\": dial tcp 10.133.0.55:8000: connect: connection refused" Apr 21 16:01:42.242093 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:01:42.242040 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" podUID="6cb3c30c-55b8-4b3e-9c1c-32d913925335" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8000/health\": dial tcp 10.133.0.55:8000: connect: connection refused" Apr 21 16:01:52.253546 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:01:52.253515 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" Apr 21 16:01:52.261992 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:01:52.261964 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" Apr 21 16:02:03.396470 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.396432 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk"] Apr 21 16:02:03.396858 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.396776 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96160468-5588-44bd-9c1f-cd17ce9cef6a" containerName="llm-d-routing-sidecar" Apr 21 16:02:03.396858 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.396788 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="96160468-5588-44bd-9c1f-cd17ce9cef6a" containerName="llm-d-routing-sidecar" Apr 21 16:02:03.396858 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.396806 2569 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="96160468-5588-44bd-9c1f-cd17ce9cef6a" containerName="main" Apr 21 16:02:03.396858 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.396812 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="96160468-5588-44bd-9c1f-cd17ce9cef6a" containerName="main" Apr 21 16:02:03.396858 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.396819 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96160468-5588-44bd-9c1f-cd17ce9cef6a" containerName="storage-initializer" Apr 21 16:02:03.396858 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.396824 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="96160468-5588-44bd-9c1f-cd17ce9cef6a" containerName="storage-initializer" Apr 21 16:02:03.397050 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.396877 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="96160468-5588-44bd-9c1f-cd17ce9cef6a" containerName="llm-d-routing-sidecar" Apr 21 16:02:03.397050 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.396888 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="96160468-5588-44bd-9c1f-cd17ce9cef6a" containerName="main" Apr 21 16:02:03.400356 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.400334 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" Apr 21 16:02:03.402832 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.402812 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 21 16:02:03.402946 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.402812 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-dockercfg-4pd4w\"" Apr 21 16:02:03.411272 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.411247 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk"] Apr 21 16:02:03.480543 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.480504 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-model-cache\") pod \"custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk\" (UID: \"c7f5d183-76fb-4e4a-aed3-cdbb755398c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" Apr 21 16:02:03.480717 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.480565 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-dshm\") pod \"custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk\" (UID: \"c7f5d183-76fb-4e4a-aed3-cdbb755398c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" Apr 21 16:02:03.480717 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.480587 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk\" (UID: \"c7f5d183-76fb-4e4a-aed3-cdbb755398c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" Apr 21 16:02:03.480717 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.480612 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrj6j\" (UniqueName: \"kubernetes.io/projected/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-kube-api-access-vrj6j\") pod \"custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk\" (UID: \"c7f5d183-76fb-4e4a-aed3-cdbb755398c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" Apr 21 16:02:03.480717 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.480630 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-home\") pod \"custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk\" (UID: \"c7f5d183-76fb-4e4a-aed3-cdbb755398c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" Apr 21 16:02:03.480855 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.480737 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk\" (UID: \"c7f5d183-76fb-4e4a-aed3-cdbb755398c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" Apr 21 16:02:03.581519 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.581454 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-tls-certs\") pod 
\"custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk\" (UID: \"c7f5d183-76fb-4e4a-aed3-cdbb755398c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" Apr 21 16:02:03.581710 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.581540 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-model-cache\") pod \"custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk\" (UID: \"c7f5d183-76fb-4e4a-aed3-cdbb755398c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" Apr 21 16:02:03.581710 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.581584 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-dshm\") pod \"custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk\" (UID: \"c7f5d183-76fb-4e4a-aed3-cdbb755398c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" Apr 21 16:02:03.581710 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.581604 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk\" (UID: \"c7f5d183-76fb-4e4a-aed3-cdbb755398c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" Apr 21 16:02:03.581710 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.581648 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vrj6j\" (UniqueName: \"kubernetes.io/projected/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-kube-api-access-vrj6j\") pod \"custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk\" (UID: \"c7f5d183-76fb-4e4a-aed3-cdbb755398c5\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" Apr 21 16:02:03.581993 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.581925 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-model-cache\") pod \"custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk\" (UID: \"c7f5d183-76fb-4e4a-aed3-cdbb755398c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" Apr 21 16:02:03.582058 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.582027 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-home\") pod \"custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk\" (UID: \"c7f5d183-76fb-4e4a-aed3-cdbb755398c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" Apr 21 16:02:03.582116 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.582095 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk\" (UID: \"c7f5d183-76fb-4e4a-aed3-cdbb755398c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" Apr 21 16:02:03.582260 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.582241 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-home\") pod \"custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk\" (UID: \"c7f5d183-76fb-4e4a-aed3-cdbb755398c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" Apr 21 16:02:03.583885 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.583862 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-dshm\") pod \"custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk\" (UID: \"c7f5d183-76fb-4e4a-aed3-cdbb755398c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" Apr 21 16:02:03.584161 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.584144 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk\" (UID: \"c7f5d183-76fb-4e4a-aed3-cdbb755398c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" Apr 21 16:02:03.590340 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.590318 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrj6j\" (UniqueName: \"kubernetes.io/projected/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-kube-api-access-vrj6j\") pod \"custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk\" (UID: \"c7f5d183-76fb-4e4a-aed3-cdbb755398c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" Apr 21 16:02:03.644677 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.644639 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq"] Apr 21 16:02:03.648585 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.648518 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq" Apr 21 16:02:03.651101 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.651079 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-epp-sa-dockercfg-lpb9p\"" Apr 21 16:02:03.659159 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.659133 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq"] Apr 21 16:02:03.711307 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.711268 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" Apr 21 16:02:03.783552 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.783520 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tf5q\" (UniqueName: \"kubernetes.io/projected/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-kube-api-access-7tf5q\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq\" (UID: \"cee1b71f-085f-4e69-bbef-816ce7e8fd8e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq" Apr 21 16:02:03.783698 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.783584 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq\" (UID: \"cee1b71f-085f-4e69-bbef-816ce7e8fd8e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq" Apr 21 16:02:03.783698 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.783610 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq\" (UID: \"cee1b71f-085f-4e69-bbef-816ce7e8fd8e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq" Apr 21 16:02:03.783698 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.783626 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq\" (UID: \"cee1b71f-085f-4e69-bbef-816ce7e8fd8e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq" Apr 21 16:02:03.783698 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.783655 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq\" (UID: \"cee1b71f-085f-4e69-bbef-816ce7e8fd8e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq" Apr 21 16:02:03.783698 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.783680 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq\" (UID: \"cee1b71f-085f-4e69-bbef-816ce7e8fd8e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq" Apr 21 16:02:03.841311 ip-10-0-128-232 kubenswrapper[2569]: I0421 
16:02:03.841277 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk"] Apr 21 16:02:03.844333 ip-10-0-128-232 kubenswrapper[2569]: W0421 16:02:03.844305 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7f5d183_76fb_4e4a_aed3_cdbb755398c5.slice/crio-2da6750226ea400f76e62e81deec5f4b04a1305acb0c0c2e1bc1286d34bed750 WatchSource:0}: Error finding container 2da6750226ea400f76e62e81deec5f4b04a1305acb0c0c2e1bc1286d34bed750: Status 404 returned error can't find the container with id 2da6750226ea400f76e62e81deec5f4b04a1305acb0c0c2e1bc1286d34bed750 Apr 21 16:02:03.846135 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.846114 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 16:02:03.884175 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.884148 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq\" (UID: \"cee1b71f-085f-4e69-bbef-816ce7e8fd8e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq" Apr 21 16:02:03.884297 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.884198 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7tf5q\" (UniqueName: \"kubernetes.io/projected/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-kube-api-access-7tf5q\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq\" (UID: \"cee1b71f-085f-4e69-bbef-816ce7e8fd8e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq" Apr 21 16:02:03.884297 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.884266 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq\" (UID: \"cee1b71f-085f-4e69-bbef-816ce7e8fd8e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq" Apr 21 16:02:03.884419 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.884377 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq\" (UID: \"cee1b71f-085f-4e69-bbef-816ce7e8fd8e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq" Apr 21 16:02:03.884419 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.884405 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq\" (UID: \"cee1b71f-085f-4e69-bbef-816ce7e8fd8e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq" Apr 21 16:02:03.884582 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.884460 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq\" (UID: \"cee1b71f-085f-4e69-bbef-816ce7e8fd8e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq" Apr 21 16:02:03.884896 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.884874 2569 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq\" (UID: \"cee1b71f-085f-4e69-bbef-816ce7e8fd8e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq" Apr 21 16:02:03.884986 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.884925 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq\" (UID: \"cee1b71f-085f-4e69-bbef-816ce7e8fd8e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq" Apr 21 16:02:03.884986 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.884950 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq\" (UID: \"cee1b71f-085f-4e69-bbef-816ce7e8fd8e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq" Apr 21 16:02:03.885099 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.885003 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq\" (UID: \"cee1b71f-085f-4e69-bbef-816ce7e8fd8e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq" Apr 21 16:02:03.886659 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.886638 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq\" (UID: \"cee1b71f-085f-4e69-bbef-816ce7e8fd8e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq"
Apr 21 16:02:03.892002 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.891975 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tf5q\" (UniqueName: \"kubernetes.io/projected/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-kube-api-access-7tf5q\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq\" (UID: \"cee1b71f-085f-4e69-bbef-816ce7e8fd8e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq"
Apr 21 16:02:03.958304 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:03.958268 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq"
Apr 21 16:02:04.126149 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:04.125928 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq"]
Apr 21 16:02:04.189508 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:04.189455 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" event={"ID":"c7f5d183-76fb-4e4a-aed3-cdbb755398c5","Type":"ContainerStarted","Data":"98836aef681f4138c8411e700f697879f65d2389b4680d9687ed2906546c5153"}
Apr 21 16:02:04.189639 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:04.189518 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" event={"ID":"c7f5d183-76fb-4e4a-aed3-cdbb755398c5","Type":"ContainerStarted","Data":"2da6750226ea400f76e62e81deec5f4b04a1305acb0c0c2e1bc1286d34bed750"}
Apr 21 16:02:04.189639 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:04.189584 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk"
Apr 21 16:02:04.190987 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:04.190961 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq" event={"ID":"cee1b71f-085f-4e69-bbef-816ce7e8fd8e","Type":"ContainerStarted","Data":"34e6e32b3f49bb8c1dc5317734ef8c51ce06eb392c1cff19b9f0d150da5f69f1"}
Apr 21 16:02:04.191084 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:04.190992 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq" event={"ID":"cee1b71f-085f-4e69-bbef-816ce7e8fd8e","Type":"ContainerStarted","Data":"c2b2cb0a4bbef3641d0e283ce3c9a42f31d1f6ee23f5f22cabcf79aba4dd971e"}
Apr 21 16:02:05.196851 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:05.196803 2569 generic.go:358] "Generic (PLEG): container finished" podID="cee1b71f-085f-4e69-bbef-816ce7e8fd8e" containerID="34e6e32b3f49bb8c1dc5317734ef8c51ce06eb392c1cff19b9f0d150da5f69f1" exitCode=0
Apr 21 16:02:05.197576 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:05.196843 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq" event={"ID":"cee1b71f-085f-4e69-bbef-816ce7e8fd8e","Type":"ContainerDied","Data":"34e6e32b3f49bb8c1dc5317734ef8c51ce06eb392c1cff19b9f0d150da5f69f1"}
Apr 21 16:02:05.199204 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:05.199172 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" event={"ID":"c7f5d183-76fb-4e4a-aed3-cdbb755398c5","Type":"ContainerStarted","Data":"cb002052073724a8f94a335997847b206eb266de5211895a04b8a62c7f1189f9"}
Apr 21 16:02:05.414590 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:05.414551 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx"]
Apr 21 16:02:05.415002 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:05.414921 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" podUID="6cb3c30c-55b8-4b3e-9c1c-32d913925335" containerName="main" containerID="cri-o://eaa1f1066f81336da63d8113855fc8bfc83f3eb706900fc3781671e0294e4c1a" gracePeriod=30
Apr 21 16:02:06.205598 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:06.205560 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq" event={"ID":"cee1b71f-085f-4e69-bbef-816ce7e8fd8e","Type":"ContainerStarted","Data":"3cdb3e5fe5e03c4b142d554bd0cb75aee846880915cc4a512408dea2fbe00d96"}
Apr 21 16:02:06.205598 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:06.205595 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq" event={"ID":"cee1b71f-085f-4e69-bbef-816ce7e8fd8e","Type":"ContainerStarted","Data":"df8ce049f75a4d2758036f590be49e8c1e19ae59b67ee69a17a3b62e6cc5c3e1"}
Apr 21 16:02:06.206325 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:06.205885 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq"
Apr 21 16:02:06.227853 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:06.227800 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq" podStartSLOduration=3.227786151 podStartE2EDuration="3.227786151s" podCreationTimestamp="2026-04-21 16:02:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:02:06.225184433 +0000 UTC m=+1601.797217389" watchObservedRunningTime="2026-04-21 16:02:06.227786151 +0000 UTC m=+1601.799819098"
Apr 21 16:02:09.224116 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:09.224076 2569 generic.go:358] "Generic (PLEG): container finished" podID="c7f5d183-76fb-4e4a-aed3-cdbb755398c5" containerID="cb002052073724a8f94a335997847b206eb266de5211895a04b8a62c7f1189f9" exitCode=0
Apr 21 16:02:09.224504 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:09.224113 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" event={"ID":"c7f5d183-76fb-4e4a-aed3-cdbb755398c5","Type":"ContainerDied","Data":"cb002052073724a8f94a335997847b206eb266de5211895a04b8a62c7f1189f9"}
Apr 21 16:02:10.231825 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:10.231786 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" event={"ID":"c7f5d183-76fb-4e4a-aed3-cdbb755398c5","Type":"ContainerStarted","Data":"b1203070dbb59e8702d690c36cf6d021935cb587e32d3891635ed18811ea86ba"}
Apr 21 16:02:10.257846 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:10.257760 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" podStartSLOduration=7.25773942 podStartE2EDuration="7.25773942s" podCreationTimestamp="2026-04-21 16:02:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:02:10.254673807 +0000 UTC m=+1605.826706751" watchObservedRunningTime="2026-04-21 16:02:10.25773942 +0000 UTC m=+1605.829772366"
Apr 21 16:02:13.040094 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:13.040054 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 21 16:02:13.046315 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:13.046288 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 21 16:02:13.049098 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:13.049076 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-pclvk\""
Apr 21 16:02:13.050115 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:13.050093 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\""
Apr 21 16:02:13.057049 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:13.057021 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 21 16:02:13.171712 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:13.171672 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f0ca46e5-975c-4918-af98-7fdb5d238330-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f0ca46e5-975c-4918-af98-7fdb5d238330\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 21 16:02:13.171889 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:13.171744 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f0ca46e5-975c-4918-af98-7fdb5d238330-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f0ca46e5-975c-4918-af98-7fdb5d238330\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 21 16:02:13.171889 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:13.171781 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f0ca46e5-975c-4918-af98-7fdb5d238330-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f0ca46e5-975c-4918-af98-7fdb5d238330\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 21 16:02:13.171889 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:13.171800 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrv7r\" (UniqueName: \"kubernetes.io/projected/f0ca46e5-975c-4918-af98-7fdb5d238330-kube-api-access-qrv7r\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f0ca46e5-975c-4918-af98-7fdb5d238330\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 21 16:02:13.171889 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:13.171819 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f0ca46e5-975c-4918-af98-7fdb5d238330-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f0ca46e5-975c-4918-af98-7fdb5d238330\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 21 16:02:13.171889 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:13.171865 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f0ca46e5-975c-4918-af98-7fdb5d238330-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f0ca46e5-975c-4918-af98-7fdb5d238330\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 21 16:02:13.272599 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:13.272562 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f0ca46e5-975c-4918-af98-7fdb5d238330-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f0ca46e5-975c-4918-af98-7fdb5d238330\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 21 16:02:13.272784 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:13.272620 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f0ca46e5-975c-4918-af98-7fdb5d238330-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f0ca46e5-975c-4918-af98-7fdb5d238330\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 21 16:02:13.272784 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:13.272659 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f0ca46e5-975c-4918-af98-7fdb5d238330-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f0ca46e5-975c-4918-af98-7fdb5d238330\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 21 16:02:13.272784 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:13.272687 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrv7r\" (UniqueName: \"kubernetes.io/projected/f0ca46e5-975c-4918-af98-7fdb5d238330-kube-api-access-qrv7r\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f0ca46e5-975c-4918-af98-7fdb5d238330\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 21 16:02:13.272784 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:13.272718 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f0ca46e5-975c-4918-af98-7fdb5d238330-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f0ca46e5-975c-4918-af98-7fdb5d238330\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 21 16:02:13.272784 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:13.272744 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f0ca46e5-975c-4918-af98-7fdb5d238330-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f0ca46e5-975c-4918-af98-7fdb5d238330\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 21 16:02:13.273046 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:13.272984 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f0ca46e5-975c-4918-af98-7fdb5d238330-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f0ca46e5-975c-4918-af98-7fdb5d238330\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 21 16:02:13.273046 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:13.273032 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f0ca46e5-975c-4918-af98-7fdb5d238330-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f0ca46e5-975c-4918-af98-7fdb5d238330\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 21 16:02:13.273162 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:13.273139 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f0ca46e5-975c-4918-af98-7fdb5d238330-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f0ca46e5-975c-4918-af98-7fdb5d238330\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 21 16:02:13.274963 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:13.274940 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f0ca46e5-975c-4918-af98-7fdb5d238330-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f0ca46e5-975c-4918-af98-7fdb5d238330\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 21 16:02:13.275197 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:13.275181 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f0ca46e5-975c-4918-af98-7fdb5d238330-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f0ca46e5-975c-4918-af98-7fdb5d238330\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 21 16:02:13.280949 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:13.280918 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrv7r\" (UniqueName: \"kubernetes.io/projected/f0ca46e5-975c-4918-af98-7fdb5d238330-kube-api-access-qrv7r\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f0ca46e5-975c-4918-af98-7fdb5d238330\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 21 16:02:13.359816 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:13.359728 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 21 16:02:13.491211 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:13.491180 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 21 16:02:13.493043 ip-10-0-128-232 kubenswrapper[2569]: W0421 16:02:13.493006 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0ca46e5_975c_4918_af98_7fdb5d238330.slice/crio-09463c4539fe02f6dec0e90e8c9b846b045fe786a8c739cce2b30a3f4ca60c79 WatchSource:0}: Error finding container 09463c4539fe02f6dec0e90e8c9b846b045fe786a8c739cce2b30a3f4ca60c79: Status 404 returned error can't find the container with id 09463c4539fe02f6dec0e90e8c9b846b045fe786a8c739cce2b30a3f4ca60c79
Apr 21 16:02:13.711625 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:13.711592 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk"
Apr 21 16:02:13.712048 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:13.712001 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk"
Apr 21 16:02:13.713083 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:13.713035 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" podUID="c7f5d183-76fb-4e4a-aed3-cdbb755398c5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8001/health\": dial tcp 10.133.0.56:8001: connect: connection refused"
Apr 21 16:02:13.735443 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:13.735406 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk"
Apr 21 16:02:13.958962 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:13.958919 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq"
Apr 21 16:02:13.959160 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:13.958979 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq"
Apr 21 16:02:13.961867 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:13.961788 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq"
Apr 21 16:02:14.250733 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:14.250627 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"f0ca46e5-975c-4918-af98-7fdb5d238330","Type":"ContainerStarted","Data":"8e57c166d97071d14b2fd8af32fc4f232dfd1c45ded2c71a98ae927dceac39eb"}
Apr 21 16:02:14.250733 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:14.250679 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"f0ca46e5-975c-4918-af98-7fdb5d238330","Type":"ContainerStarted","Data":"09463c4539fe02f6dec0e90e8c9b846b045fe786a8c739cce2b30a3f4ca60c79"}
Apr 21 16:02:14.252225 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:14.252203 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq"
Apr 21 16:02:18.271711 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:18.271674 2569 generic.go:358] "Generic (PLEG): container finished" podID="f0ca46e5-975c-4918-af98-7fdb5d238330" containerID="8e57c166d97071d14b2fd8af32fc4f232dfd1c45ded2c71a98ae927dceac39eb" exitCode=0
Apr 21 16:02:18.272167 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:18.271735 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"f0ca46e5-975c-4918-af98-7fdb5d238330","Type":"ContainerDied","Data":"8e57c166d97071d14b2fd8af32fc4f232dfd1c45ded2c71a98ae927dceac39eb"}
Apr 21 16:02:19.278267 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:19.278217 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"f0ca46e5-975c-4918-af98-7fdb5d238330","Type":"ContainerStarted","Data":"207c39bb95e05f0ca45c27930a58c5a9b398f7bdff0bcfe9479993ce3b1ca7e5"}
Apr 21 16:02:19.300995 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:19.300921 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podStartSLOduration=6.300900748 podStartE2EDuration="6.300900748s" podCreationTimestamp="2026-04-21 16:02:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:02:19.297503163 +0000 UTC m=+1614.869536109" watchObservedRunningTime="2026-04-21 16:02:19.300900748 +0000 UTC m=+1614.872933692"
Apr 21 16:02:23.360701 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:23.360645 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 21 16:02:23.362319 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:23.362288 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f0ca46e5-975c-4918-af98-7fdb5d238330" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused"
Apr 21 16:02:23.712687 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:23.712558 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" podUID="c7f5d183-76fb-4e4a-aed3-cdbb755398c5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8001/health\": dial tcp 10.133.0.56:8001: connect: connection refused"
Apr 21 16:02:33.360640 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:33.360588 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f0ca46e5-975c-4918-af98-7fdb5d238330" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused"
Apr 21 16:02:33.711979 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:33.711926 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" podUID="c7f5d183-76fb-4e4a-aed3-cdbb755398c5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8001/health\": dial tcp 10.133.0.56:8001: connect: connection refused"
Apr 21 16:02:35.259411 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:35.259380 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq"
Apr 21 16:02:35.711790 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:35.711765 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx_6cb3c30c-55b8-4b3e-9c1c-32d913925335/main/0.log"
Apr 21 16:02:35.712215 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:35.712198 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx"
Apr 21 16:02:35.784023 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:35.783984 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m6kq\" (UniqueName: \"kubernetes.io/projected/6cb3c30c-55b8-4b3e-9c1c-32d913925335-kube-api-access-2m6kq\") pod \"6cb3c30c-55b8-4b3e-9c1c-32d913925335\" (UID: \"6cb3c30c-55b8-4b3e-9c1c-32d913925335\") "
Apr 21 16:02:35.784023 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:35.784031 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6cb3c30c-55b8-4b3e-9c1c-32d913925335-model-cache\") pod \"6cb3c30c-55b8-4b3e-9c1c-32d913925335\" (UID: \"6cb3c30c-55b8-4b3e-9c1c-32d913925335\") "
Apr 21 16:02:35.784273 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:35.784092 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6cb3c30c-55b8-4b3e-9c1c-32d913925335-home\") pod \"6cb3c30c-55b8-4b3e-9c1c-32d913925335\" (UID: \"6cb3c30c-55b8-4b3e-9c1c-32d913925335\") "
Apr 21 16:02:35.784273 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:35.784109 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6cb3c30c-55b8-4b3e-9c1c-32d913925335-kserve-provision-location\") pod \"6cb3c30c-55b8-4b3e-9c1c-32d913925335\" (UID: \"6cb3c30c-55b8-4b3e-9c1c-32d913925335\") "
Apr 21 16:02:35.784273 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:35.784159 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6cb3c30c-55b8-4b3e-9c1c-32d913925335-tls-certs\") pod \"6cb3c30c-55b8-4b3e-9c1c-32d913925335\" (UID: \"6cb3c30c-55b8-4b3e-9c1c-32d913925335\") "
Apr 21 16:02:35.784273 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:35.784182 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6cb3c30c-55b8-4b3e-9c1c-32d913925335-dshm\") pod \"6cb3c30c-55b8-4b3e-9c1c-32d913925335\" (UID: \"6cb3c30c-55b8-4b3e-9c1c-32d913925335\") "
Apr 21 16:02:35.784522 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:35.784360 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cb3c30c-55b8-4b3e-9c1c-32d913925335-model-cache" (OuterVolumeSpecName: "model-cache") pod "6cb3c30c-55b8-4b3e-9c1c-32d913925335" (UID: "6cb3c30c-55b8-4b3e-9c1c-32d913925335"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 16:02:35.784522 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:35.784435 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6cb3c30c-55b8-4b3e-9c1c-32d913925335-model-cache\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\""
Apr 21 16:02:35.784639 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:35.784524 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cb3c30c-55b8-4b3e-9c1c-32d913925335-home" (OuterVolumeSpecName: "home") pod "6cb3c30c-55b8-4b3e-9c1c-32d913925335" (UID: "6cb3c30c-55b8-4b3e-9c1c-32d913925335"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 16:02:35.794011 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:35.793974 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cb3c30c-55b8-4b3e-9c1c-32d913925335-kube-api-access-2m6kq" (OuterVolumeSpecName: "kube-api-access-2m6kq") pod "6cb3c30c-55b8-4b3e-9c1c-32d913925335" (UID: "6cb3c30c-55b8-4b3e-9c1c-32d913925335"). InnerVolumeSpecName "kube-api-access-2m6kq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 16:02:35.794343 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:35.794311 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cb3c30c-55b8-4b3e-9c1c-32d913925335-dshm" (OuterVolumeSpecName: "dshm") pod "6cb3c30c-55b8-4b3e-9c1c-32d913925335" (UID: "6cb3c30c-55b8-4b3e-9c1c-32d913925335"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 16:02:35.794530 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:35.794509 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cb3c30c-55b8-4b3e-9c1c-32d913925335-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6cb3c30c-55b8-4b3e-9c1c-32d913925335" (UID: "6cb3c30c-55b8-4b3e-9c1c-32d913925335"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 16:02:35.862710 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:35.862615 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cb3c30c-55b8-4b3e-9c1c-32d913925335-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6cb3c30c-55b8-4b3e-9c1c-32d913925335" (UID: "6cb3c30c-55b8-4b3e-9c1c-32d913925335"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 16:02:35.885714 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:35.885680 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6cb3c30c-55b8-4b3e-9c1c-32d913925335-home\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\""
Apr 21 16:02:35.885714 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:35.885709 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6cb3c30c-55b8-4b3e-9c1c-32d913925335-kserve-provision-location\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\""
Apr 21 16:02:35.885714 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:35.885721 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6cb3c30c-55b8-4b3e-9c1c-32d913925335-tls-certs\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\""
Apr 21 16:02:35.886026 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:35.885735 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6cb3c30c-55b8-4b3e-9c1c-32d913925335-dshm\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\""
Apr 21 16:02:35.886026 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:35.885745 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2m6kq\" (UniqueName: \"kubernetes.io/projected/6cb3c30c-55b8-4b3e-9c1c-32d913925335-kube-api-access-2m6kq\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\""
Apr 21 16:02:36.352267 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:36.352232 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx_6cb3c30c-55b8-4b3e-9c1c-32d913925335/main/0.log"
Apr 21 16:02:36.352763 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:36.352594 2569 generic.go:358] "Generic (PLEG): container finished" podID="6cb3c30c-55b8-4b3e-9c1c-32d913925335" containerID="eaa1f1066f81336da63d8113855fc8bfc83f3eb706900fc3781671e0294e4c1a" exitCode=137
Apr 21 16:02:36.352763 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:36.352660 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" event={"ID":"6cb3c30c-55b8-4b3e-9c1c-32d913925335","Type":"ContainerDied","Data":"eaa1f1066f81336da63d8113855fc8bfc83f3eb706900fc3781671e0294e4c1a"}
Apr 21 16:02:36.352763 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:36.352675 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx"
Apr 21 16:02:36.352763 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:36.352683 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx" event={"ID":"6cb3c30c-55b8-4b3e-9c1c-32d913925335","Type":"ContainerDied","Data":"806fab78aaaeb2f3212a25a7e72f082aa52ae0bc393b3e9a2efa86ffd8552003"}
Apr 21 16:02:36.352763 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:36.352700 2569 scope.go:117] "RemoveContainer" containerID="eaa1f1066f81336da63d8113855fc8bfc83f3eb706900fc3781671e0294e4c1a"
Apr 21 16:02:36.379125 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:36.379086 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx"]
Apr 21 16:02:36.383262 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:36.383229 2569 scope.go:117] "RemoveContainer" containerID="c1d1115a39484f62cd8179f4a209c2ec2dd13071faf3da38b1c4ed185a7c3024"
Apr 21 16:02:36.385749 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:36.385718 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7bb5c5f6dc5czhx"]
Apr 21 16:02:36.455459 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:36.455158 2569 scope.go:117] "RemoveContainer" containerID="eaa1f1066f81336da63d8113855fc8bfc83f3eb706900fc3781671e0294e4c1a"
Apr 21 16:02:36.455577 ip-10-0-128-232 kubenswrapper[2569]: E0421 16:02:36.455495 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaa1f1066f81336da63d8113855fc8bfc83f3eb706900fc3781671e0294e4c1a\": container with ID starting with eaa1f1066f81336da63d8113855fc8bfc83f3eb706900fc3781671e0294e4c1a not found: ID does not exist" containerID="eaa1f1066f81336da63d8113855fc8bfc83f3eb706900fc3781671e0294e4c1a"
Apr 21 16:02:36.455577 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:36.455539 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaa1f1066f81336da63d8113855fc8bfc83f3eb706900fc3781671e0294e4c1a"} err="failed to get container status \"eaa1f1066f81336da63d8113855fc8bfc83f3eb706900fc3781671e0294e4c1a\": rpc error: code = NotFound desc = could not find container \"eaa1f1066f81336da63d8113855fc8bfc83f3eb706900fc3781671e0294e4c1a\": container with ID starting with eaa1f1066f81336da63d8113855fc8bfc83f3eb706900fc3781671e0294e4c1a not found: ID does not exist"
Apr 21 16:02:36.455577 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:36.455573 2569 scope.go:117] "RemoveContainer" containerID="c1d1115a39484f62cd8179f4a209c2ec2dd13071faf3da38b1c4ed185a7c3024"
Apr 21 16:02:36.455854 ip-10-0-128-232 kubenswrapper[2569]: E0421 16:02:36.455827 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1d1115a39484f62cd8179f4a209c2ec2dd13071faf3da38b1c4ed185a7c3024\": container with ID starting with c1d1115a39484f62cd8179f4a209c2ec2dd13071faf3da38b1c4ed185a7c3024 not found: ID does not exist" containerID="c1d1115a39484f62cd8179f4a209c2ec2dd13071faf3da38b1c4ed185a7c3024"
Apr 21 16:02:36.455950 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:36.455862 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1d1115a39484f62cd8179f4a209c2ec2dd13071faf3da38b1c4ed185a7c3024"} err="failed to get container status \"c1d1115a39484f62cd8179f4a209c2ec2dd13071faf3da38b1c4ed185a7c3024\": rpc error: code = NotFound desc = could not find container \"c1d1115a39484f62cd8179f4a209c2ec2dd13071faf3da38b1c4ed185a7c3024\": container with ID starting with c1d1115a39484f62cd8179f4a209c2ec2dd13071faf3da38b1c4ed185a7c3024 not found: ID does not exist"
Apr 21 16:02:37.057018 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:37.056977 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cb3c30c-55b8-4b3e-9c1c-32d913925335" path="/var/lib/kubelet/pods/6cb3c30c-55b8-4b3e-9c1c-32d913925335/volumes"
Apr 21 16:02:43.360793 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:43.360751 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 21 16:02:43.361354 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:43.361074 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f0ca46e5-975c-4918-af98-7fdb5d238330" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused"
Apr 21 16:02:43.711890 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:43.711844 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" podUID="c7f5d183-76fb-4e4a-aed3-cdbb755398c5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8001/health\": dial tcp 10.133.0.56:8001: connect: connection refused"
Apr 21 16:02:53.361049 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:53.361000 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f0ca46e5-975c-4918-af98-7fdb5d238330" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused"
Apr 21 16:02:53.712314 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:02:53.712238 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" podUID="c7f5d183-76fb-4e4a-aed3-cdbb755398c5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8001/health\": dial tcp 10.133.0.56:8001: connect: connection refused"
Apr 21 16:03:03.360901 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:03:03.360842 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f0ca46e5-975c-4918-af98-7fdb5d238330" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused"
Apr 21 16:03:03.712321 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:03:03.712270 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" podUID="c7f5d183-76fb-4e4a-aed3-cdbb755398c5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8001/health\": dial tcp 10.133.0.56:8001: connect: connection refused"
Apr 21 16:03:13.360368 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:03:13.360309 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f0ca46e5-975c-4918-af98-7fdb5d238330" containerName="main" probeResult="failure" output="Get 
\"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused" Apr 21 16:03:13.712538 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:03:13.712473 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" podUID="c7f5d183-76fb-4e4a-aed3-cdbb755398c5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8001/health\": dial tcp 10.133.0.56:8001: connect: connection refused" Apr 21 16:03:23.360495 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:03:23.360446 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f0ca46e5-975c-4918-af98-7fdb5d238330" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused" Apr 21 16:03:23.712589 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:03:23.712533 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" podUID="c7f5d183-76fb-4e4a-aed3-cdbb755398c5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8001/health\": dial tcp 10.133.0.56:8001: connect: connection refused" Apr 21 16:03:33.360710 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:03:33.360658 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f0ca46e5-975c-4918-af98-7fdb5d238330" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused" Apr 21 16:03:33.711716 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:03:33.711667 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" podUID="c7f5d183-76fb-4e4a-aed3-cdbb755398c5" 
containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8001/health\": dial tcp 10.133.0.56:8001: connect: connection refused" Apr 21 16:03:43.360916 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:03:43.360861 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f0ca46e5-975c-4918-af98-7fdb5d238330" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused" Apr 21 16:03:43.712049 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:03:43.712001 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" podUID="c7f5d183-76fb-4e4a-aed3-cdbb755398c5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8001/health\": dial tcp 10.133.0.56:8001: connect: connection refused" Apr 21 16:03:53.360327 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:03:53.360272 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f0ca46e5-975c-4918-af98-7fdb5d238330" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused" Apr 21 16:03:53.712344 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:03:53.712295 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" podUID="c7f5d183-76fb-4e4a-aed3-cdbb755398c5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8001/health\": dial tcp 10.133.0.56:8001: connect: connection refused" Apr 21 16:04:03.361049 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:03.360991 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" 
podUID="f0ca46e5-975c-4918-af98-7fdb5d238330" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused" Apr 21 16:04:03.712086 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:03.712023 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" podUID="c7f5d183-76fb-4e4a-aed3-cdbb755398c5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8001/health\": dial tcp 10.133.0.56:8001: connect: connection refused" Apr 21 16:04:13.361492 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:13.361429 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f0ca46e5-975c-4918-af98-7fdb5d238330" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused" Apr 21 16:04:13.712726 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:13.712668 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" podUID="c7f5d183-76fb-4e4a-aed3-cdbb755398c5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8001/health\": dial tcp 10.133.0.56:8001: connect: connection refused" Apr 21 16:04:23.360371 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:23.360320 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f0ca46e5-975c-4918-af98-7fdb5d238330" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused" Apr 21 16:04:23.721629 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:23.721595 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" Apr 21 16:04:23.747434 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:23.747402 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" Apr 21 16:04:33.379350 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:33.379292 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 21 16:04:33.394737 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:33.394698 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 21 16:04:35.129572 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:35.129526 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk"] Apr 21 16:04:35.130075 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:35.129957 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" podUID="c7f5d183-76fb-4e4a-aed3-cdbb755398c5" containerName="main" containerID="cri-o://b1203070dbb59e8702d690c36cf6d021935cb587e32d3891635ed18811ea86ba" gracePeriod=30 Apr 21 16:04:35.135697 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:35.135671 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq"] Apr 21 16:04:35.135986 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:35.135953 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq" podUID="cee1b71f-085f-4e69-bbef-816ce7e8fd8e" containerName="main" 
containerID="cri-o://df8ce049f75a4d2758036f590be49e8c1e19ae59b67ee69a17a3b62e6cc5c3e1" gracePeriod=30 Apr 21 16:04:35.136119 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:35.135994 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq" podUID="cee1b71f-085f-4e69-bbef-816ce7e8fd8e" containerName="tokenizer" containerID="cri-o://3cdb3e5fe5e03c4b142d554bd0cb75aee846880915cc4a512408dea2fbe00d96" gracePeriod=30 Apr 21 16:04:35.257984 ip-10-0-128-232 kubenswrapper[2569]: W0421 16:04:35.257958 2569 logging.go:55] [core] [Channel #342 SubChannel #343]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.57:9003", ServerName: "10.133.0.57:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.57:9003: connect: connection refused" Apr 21 16:04:35.859345 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:35.859294 2569 generic.go:358] "Generic (PLEG): container finished" podID="cee1b71f-085f-4e69-bbef-816ce7e8fd8e" containerID="df8ce049f75a4d2758036f590be49e8c1e19ae59b67ee69a17a3b62e6cc5c3e1" exitCode=0 Apr 21 16:04:35.859572 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:35.859359 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq" event={"ID":"cee1b71f-085f-4e69-bbef-816ce7e8fd8e","Type":"ContainerDied","Data":"df8ce049f75a4d2758036f590be49e8c1e19ae59b67ee69a17a3b62e6cc5c3e1"} Apr 21 16:04:36.258390 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.258347 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq" podUID="cee1b71f-085f-4e69-bbef-816ce7e8fd8e" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.57:9003\" within 1s: context deadline exceeded" Apr 21 16:04:36.492366 
ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.492338 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq" Apr 21 16:04:36.583167 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.583067 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-kserve-provision-location\") pod \"cee1b71f-085f-4e69-bbef-816ce7e8fd8e\" (UID: \"cee1b71f-085f-4e69-bbef-816ce7e8fd8e\") " Apr 21 16:04:36.583167 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.583147 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-tls-certs\") pod \"cee1b71f-085f-4e69-bbef-816ce7e8fd8e\" (UID: \"cee1b71f-085f-4e69-bbef-816ce7e8fd8e\") " Apr 21 16:04:36.583415 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.583202 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-tokenizer-uds\") pod \"cee1b71f-085f-4e69-bbef-816ce7e8fd8e\" (UID: \"cee1b71f-085f-4e69-bbef-816ce7e8fd8e\") " Apr 21 16:04:36.583415 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.583222 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tf5q\" (UniqueName: \"kubernetes.io/projected/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-kube-api-access-7tf5q\") pod \"cee1b71f-085f-4e69-bbef-816ce7e8fd8e\" (UID: \"cee1b71f-085f-4e69-bbef-816ce7e8fd8e\") " Apr 21 16:04:36.583415 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.583244 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-tokenizer-cache\") pod \"cee1b71f-085f-4e69-bbef-816ce7e8fd8e\" (UID: \"cee1b71f-085f-4e69-bbef-816ce7e8fd8e\") " Apr 21 16:04:36.583415 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.583265 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-tokenizer-tmp\") pod \"cee1b71f-085f-4e69-bbef-816ce7e8fd8e\" (UID: \"cee1b71f-085f-4e69-bbef-816ce7e8fd8e\") " Apr 21 16:04:36.583649 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.583454 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "cee1b71f-085f-4e69-bbef-816ce7e8fd8e" (UID: "cee1b71f-085f-4e69-bbef-816ce7e8fd8e"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:04:36.583649 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.583516 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "cee1b71f-085f-4e69-bbef-816ce7e8fd8e" (UID: "cee1b71f-085f-4e69-bbef-816ce7e8fd8e"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:04:36.583649 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.583543 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-tokenizer-cache\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:04:36.583802 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.583744 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "cee1b71f-085f-4e69-bbef-816ce7e8fd8e" (UID: "cee1b71f-085f-4e69-bbef-816ce7e8fd8e"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:04:36.583853 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.583793 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cee1b71f-085f-4e69-bbef-816ce7e8fd8e" (UID: "cee1b71f-085f-4e69-bbef-816ce7e8fd8e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:04:36.585310 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.585290 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "cee1b71f-085f-4e69-bbef-816ce7e8fd8e" (UID: "cee1b71f-085f-4e69-bbef-816ce7e8fd8e"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:04:36.585832 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.585811 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-kube-api-access-7tf5q" (OuterVolumeSpecName: "kube-api-access-7tf5q") pod "cee1b71f-085f-4e69-bbef-816ce7e8fd8e" (UID: "cee1b71f-085f-4e69-bbef-816ce7e8fd8e"). InnerVolumeSpecName "kube-api-access-7tf5q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:04:36.684627 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.684592 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7tf5q\" (UniqueName: \"kubernetes.io/projected/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-kube-api-access-7tf5q\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:04:36.684627 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.684623 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-tokenizer-tmp\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:04:36.684627 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.684634 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-kserve-provision-location\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:04:36.684849 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.684643 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-tls-certs\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:04:36.684849 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.684652 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/cee1b71f-085f-4e69-bbef-816ce7e8fd8e-tokenizer-uds\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:04:36.865425 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.865341 2569 generic.go:358] "Generic (PLEG): container finished" podID="cee1b71f-085f-4e69-bbef-816ce7e8fd8e" containerID="3cdb3e5fe5e03c4b142d554bd0cb75aee846880915cc4a512408dea2fbe00d96" exitCode=0 Apr 21 16:04:36.865425 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.865384 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq" event={"ID":"cee1b71f-085f-4e69-bbef-816ce7e8fd8e","Type":"ContainerDied","Data":"3cdb3e5fe5e03c4b142d554bd0cb75aee846880915cc4a512408dea2fbe00d96"} Apr 21 16:04:36.865652 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.865428 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq" event={"ID":"cee1b71f-085f-4e69-bbef-816ce7e8fd8e","Type":"ContainerDied","Data":"c2b2cb0a4bbef3641d0e283ce3c9a42f31d1f6ee23f5f22cabcf79aba4dd971e"} Apr 21 16:04:36.865652 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.865445 2569 scope.go:117] "RemoveContainer" containerID="3cdb3e5fe5e03c4b142d554bd0cb75aee846880915cc4a512408dea2fbe00d96" Apr 21 16:04:36.865652 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.865447 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq" Apr 21 16:04:36.878649 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.878624 2569 scope.go:117] "RemoveContainer" containerID="df8ce049f75a4d2758036f590be49e8c1e19ae59b67ee69a17a3b62e6cc5c3e1" Apr 21 16:04:36.887562 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.887546 2569 scope.go:117] "RemoveContainer" containerID="34e6e32b3f49bb8c1dc5317734ef8c51ce06eb392c1cff19b9f0d150da5f69f1" Apr 21 16:04:36.895446 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.895408 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq"] Apr 21 16:04:36.896510 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.896466 2569 scope.go:117] "RemoveContainer" containerID="3cdb3e5fe5e03c4b142d554bd0cb75aee846880915cc4a512408dea2fbe00d96" Apr 21 16:04:36.896781 ip-10-0-128-232 kubenswrapper[2569]: E0421 16:04:36.896760 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cdb3e5fe5e03c4b142d554bd0cb75aee846880915cc4a512408dea2fbe00d96\": container with ID starting with 3cdb3e5fe5e03c4b142d554bd0cb75aee846880915cc4a512408dea2fbe00d96 not found: ID does not exist" containerID="3cdb3e5fe5e03c4b142d554bd0cb75aee846880915cc4a512408dea2fbe00d96" Apr 21 16:04:36.896847 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.896789 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cdb3e5fe5e03c4b142d554bd0cb75aee846880915cc4a512408dea2fbe00d96"} err="failed to get container status \"3cdb3e5fe5e03c4b142d554bd0cb75aee846880915cc4a512408dea2fbe00d96\": rpc error: code = NotFound desc = could not find container \"3cdb3e5fe5e03c4b142d554bd0cb75aee846880915cc4a512408dea2fbe00d96\": container with ID starting with 
3cdb3e5fe5e03c4b142d554bd0cb75aee846880915cc4a512408dea2fbe00d96 not found: ID does not exist" Apr 21 16:04:36.896847 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.896809 2569 scope.go:117] "RemoveContainer" containerID="df8ce049f75a4d2758036f590be49e8c1e19ae59b67ee69a17a3b62e6cc5c3e1" Apr 21 16:04:36.897035 ip-10-0-128-232 kubenswrapper[2569]: E0421 16:04:36.897020 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df8ce049f75a4d2758036f590be49e8c1e19ae59b67ee69a17a3b62e6cc5c3e1\": container with ID starting with df8ce049f75a4d2758036f590be49e8c1e19ae59b67ee69a17a3b62e6cc5c3e1 not found: ID does not exist" containerID="df8ce049f75a4d2758036f590be49e8c1e19ae59b67ee69a17a3b62e6cc5c3e1" Apr 21 16:04:36.897076 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.897038 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df8ce049f75a4d2758036f590be49e8c1e19ae59b67ee69a17a3b62e6cc5c3e1"} err="failed to get container status \"df8ce049f75a4d2758036f590be49e8c1e19ae59b67ee69a17a3b62e6cc5c3e1\": rpc error: code = NotFound desc = could not find container \"df8ce049f75a4d2758036f590be49e8c1e19ae59b67ee69a17a3b62e6cc5c3e1\": container with ID starting with df8ce049f75a4d2758036f590be49e8c1e19ae59b67ee69a17a3b62e6cc5c3e1 not found: ID does not exist" Apr 21 16:04:36.897076 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.897050 2569 scope.go:117] "RemoveContainer" containerID="34e6e32b3f49bb8c1dc5317734ef8c51ce06eb392c1cff19b9f0d150da5f69f1" Apr 21 16:04:36.897275 ip-10-0-128-232 kubenswrapper[2569]: E0421 16:04:36.897256 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34e6e32b3f49bb8c1dc5317734ef8c51ce06eb392c1cff19b9f0d150da5f69f1\": container with ID starting with 34e6e32b3f49bb8c1dc5317734ef8c51ce06eb392c1cff19b9f0d150da5f69f1 not found: ID does not exist" 
containerID="34e6e32b3f49bb8c1dc5317734ef8c51ce06eb392c1cff19b9f0d150da5f69f1" Apr 21 16:04:36.897322 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.897281 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34e6e32b3f49bb8c1dc5317734ef8c51ce06eb392c1cff19b9f0d150da5f69f1"} err="failed to get container status \"34e6e32b3f49bb8c1dc5317734ef8c51ce06eb392c1cff19b9f0d150da5f69f1\": rpc error: code = NotFound desc = could not find container \"34e6e32b3f49bb8c1dc5317734ef8c51ce06eb392c1cff19b9f0d150da5f69f1\": container with ID starting with 34e6e32b3f49bb8c1dc5317734ef8c51ce06eb392c1cff19b9f0d150da5f69f1 not found: ID does not exist" Apr 21 16:04:36.899075 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:36.899053 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5c898kv6cq"] Apr 21 16:04:37.055126 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:37.055094 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cee1b71f-085f-4e69-bbef-816ce7e8fd8e" path="/var/lib/kubelet/pods/cee1b71f-085f-4e69-bbef-816ce7e8fd8e/volumes" Apr 21 16:04:46.895668 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:46.895630 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5"] Apr 21 16:04:46.896041 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:46.895992 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6cb3c30c-55b8-4b3e-9c1c-32d913925335" containerName="storage-initializer" Apr 21 16:04:46.896041 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:46.896005 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cb3c30c-55b8-4b3e-9c1c-32d913925335" containerName="storage-initializer" Apr 21 16:04:46.896041 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:46.896015 2569 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="cee1b71f-085f-4e69-bbef-816ce7e8fd8e" containerName="main" Apr 21 16:04:46.896041 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:46.896021 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee1b71f-085f-4e69-bbef-816ce7e8fd8e" containerName="main" Apr 21 16:04:46.896041 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:46.896027 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cee1b71f-085f-4e69-bbef-816ce7e8fd8e" containerName="tokenizer" Apr 21 16:04:46.896041 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:46.896033 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee1b71f-085f-4e69-bbef-816ce7e8fd8e" containerName="tokenizer" Apr 21 16:04:46.896041 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:46.896043 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6cb3c30c-55b8-4b3e-9c1c-32d913925335" containerName="main" Apr 21 16:04:46.896261 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:46.896049 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cb3c30c-55b8-4b3e-9c1c-32d913925335" containerName="main" Apr 21 16:04:46.896261 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:46.896056 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cee1b71f-085f-4e69-bbef-816ce7e8fd8e" containerName="storage-initializer" Apr 21 16:04:46.896261 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:46.896061 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee1b71f-085f-4e69-bbef-816ce7e8fd8e" containerName="storage-initializer" Apr 21 16:04:46.896261 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:46.896126 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="6cb3c30c-55b8-4b3e-9c1c-32d913925335" containerName="main" Apr 21 16:04:46.896261 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:46.896133 2569 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="cee1b71f-085f-4e69-bbef-816ce7e8fd8e" containerName="main" Apr 21 16:04:46.896261 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:46.896140 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="cee1b71f-085f-4e69-bbef-816ce7e8fd8e" containerName="tokenizer" Apr 21 16:04:46.899329 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:46.899307 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" Apr 21 16:04:46.901946 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:46.901920 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-epp-sa-dockercfg-d2f6d\"" Apr 21 16:04:46.902079 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:46.902060 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 21 16:04:46.911888 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:46.911863 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5"] Apr 21 16:04:46.965246 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:46.965210 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5119a6c-0f05-4188-a554-dbb5f0819cfb-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5\" (UID: \"c5119a6c-0f05-4188-a554-dbb5f0819cfb\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" Apr 21 16:04:46.965246 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:46.965251 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klbck\" (UniqueName: 
\"kubernetes.io/projected/c5119a6c-0f05-4188-a554-dbb5f0819cfb-kube-api-access-klbck\") pod \"scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5\" (UID: \"c5119a6c-0f05-4188-a554-dbb5f0819cfb\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" Apr 21 16:04:46.965450 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:46.965317 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c5119a6c-0f05-4188-a554-dbb5f0819cfb-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5\" (UID: \"c5119a6c-0f05-4188-a554-dbb5f0819cfb\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" Apr 21 16:04:46.965450 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:46.965337 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c5119a6c-0f05-4188-a554-dbb5f0819cfb-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5\" (UID: \"c5119a6c-0f05-4188-a554-dbb5f0819cfb\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" Apr 21 16:04:46.965450 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:46.965368 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c5119a6c-0f05-4188-a554-dbb5f0819cfb-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5\" (UID: \"c5119a6c-0f05-4188-a554-dbb5f0819cfb\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" Apr 21 16:04:46.965450 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:46.965383 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c5119a6c-0f05-4188-a554-dbb5f0819cfb-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5\" (UID: \"c5119a6c-0f05-4188-a554-dbb5f0819cfb\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" Apr 21 16:04:47.065699 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:47.065668 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-klbck\" (UniqueName: \"kubernetes.io/projected/c5119a6c-0f05-4188-a554-dbb5f0819cfb-kube-api-access-klbck\") pod \"scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5\" (UID: \"c5119a6c-0f05-4188-a554-dbb5f0819cfb\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" Apr 21 16:04:47.065875 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:47.065736 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c5119a6c-0f05-4188-a554-dbb5f0819cfb-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5\" (UID: \"c5119a6c-0f05-4188-a554-dbb5f0819cfb\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" Apr 21 16:04:47.065875 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:47.065762 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c5119a6c-0f05-4188-a554-dbb5f0819cfb-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5\" (UID: \"c5119a6c-0f05-4188-a554-dbb5f0819cfb\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" Apr 21 16:04:47.065875 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:47.065783 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c5119a6c-0f05-4188-a554-dbb5f0819cfb-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5\" (UID: \"c5119a6c-0f05-4188-a554-dbb5f0819cfb\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" Apr 21 16:04:47.065875 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:47.065798 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c5119a6c-0f05-4188-a554-dbb5f0819cfb-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5\" (UID: \"c5119a6c-0f05-4188-a554-dbb5f0819cfb\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" Apr 21 16:04:47.065875 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:47.065819 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5119a6c-0f05-4188-a554-dbb5f0819cfb-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5\" (UID: \"c5119a6c-0f05-4188-a554-dbb5f0819cfb\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" Apr 21 16:04:47.066142 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:47.066113 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c5119a6c-0f05-4188-a554-dbb5f0819cfb-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5\" (UID: \"c5119a6c-0f05-4188-a554-dbb5f0819cfb\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" Apr 21 16:04:47.066201 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:47.066172 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/c5119a6c-0f05-4188-a554-dbb5f0819cfb-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5\" (UID: \"c5119a6c-0f05-4188-a554-dbb5f0819cfb\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" Apr 21 16:04:47.066268 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:47.066239 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c5119a6c-0f05-4188-a554-dbb5f0819cfb-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5\" (UID: \"c5119a6c-0f05-4188-a554-dbb5f0819cfb\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" Apr 21 16:04:47.066268 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:47.066247 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5119a6c-0f05-4188-a554-dbb5f0819cfb-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5\" (UID: \"c5119a6c-0f05-4188-a554-dbb5f0819cfb\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" Apr 21 16:04:47.068389 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:47.068367 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c5119a6c-0f05-4188-a554-dbb5f0819cfb-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5\" (UID: \"c5119a6c-0f05-4188-a554-dbb5f0819cfb\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" Apr 21 16:04:47.075189 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:47.075153 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-klbck\" (UniqueName: 
\"kubernetes.io/projected/c5119a6c-0f05-4188-a554-dbb5f0819cfb-kube-api-access-klbck\") pod \"scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5\" (UID: \"c5119a6c-0f05-4188-a554-dbb5f0819cfb\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" Apr 21 16:04:47.211140 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:47.211098 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" Apr 21 16:04:47.340626 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:47.340597 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5"] Apr 21 16:04:47.342313 ip-10-0-128-232 kubenswrapper[2569]: W0421 16:04:47.342281 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5119a6c_0f05_4188_a554_dbb5f0819cfb.slice/crio-2f373da8e6400f95483f9485801d11b5f471587f2341bf966644b35b70926f9c WatchSource:0}: Error finding container 2f373da8e6400f95483f9485801d11b5f471587f2341bf966644b35b70926f9c: Status 404 returned error can't find the container with id 2f373da8e6400f95483f9485801d11b5f471587f2341bf966644b35b70926f9c Apr 21 16:04:47.917634 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:47.917596 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" event={"ID":"c5119a6c-0f05-4188-a554-dbb5f0819cfb","Type":"ContainerStarted","Data":"164e85c09b56e61ecbb44abbc451662add4d31dfbc6fab552ce19fd2d1cd62e5"} Apr 21 16:04:47.918038 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:47.917642 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" 
event={"ID":"c5119a6c-0f05-4188-a554-dbb5f0819cfb","Type":"ContainerStarted","Data":"2f373da8e6400f95483f9485801d11b5f471587f2341bf966644b35b70926f9c"} Apr 21 16:04:48.250288 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:48.250244 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 21 16:04:48.250667 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:48.250636 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f0ca46e5-975c-4918-af98-7fdb5d238330" containerName="main" containerID="cri-o://207c39bb95e05f0ca45c27930a58c5a9b398f7bdff0bcfe9479993ce3b1ca7e5" gracePeriod=30 Apr 21 16:04:48.922616 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:48.922581 2569 generic.go:358] "Generic (PLEG): container finished" podID="c5119a6c-0f05-4188-a554-dbb5f0819cfb" containerID="164e85c09b56e61ecbb44abbc451662add4d31dfbc6fab552ce19fd2d1cd62e5" exitCode=0 Apr 21 16:04:48.923011 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:48.922664 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" event={"ID":"c5119a6c-0f05-4188-a554-dbb5f0819cfb","Type":"ContainerDied","Data":"164e85c09b56e61ecbb44abbc451662add4d31dfbc6fab552ce19fd2d1cd62e5"} Apr 21 16:04:49.108608 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:49.108582 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 21 16:04:49.181214 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:49.181129 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f0ca46e5-975c-4918-af98-7fdb5d238330-model-cache\") pod \"f0ca46e5-975c-4918-af98-7fdb5d238330\" (UID: \"f0ca46e5-975c-4918-af98-7fdb5d238330\") " Apr 21 16:04:49.181214 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:49.181169 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f0ca46e5-975c-4918-af98-7fdb5d238330-kserve-provision-location\") pod \"f0ca46e5-975c-4918-af98-7fdb5d238330\" (UID: \"f0ca46e5-975c-4918-af98-7fdb5d238330\") " Apr 21 16:04:49.181214 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:49.181187 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrv7r\" (UniqueName: \"kubernetes.io/projected/f0ca46e5-975c-4918-af98-7fdb5d238330-kube-api-access-qrv7r\") pod \"f0ca46e5-975c-4918-af98-7fdb5d238330\" (UID: \"f0ca46e5-975c-4918-af98-7fdb5d238330\") " Apr 21 16:04:49.181527 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:49.181217 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f0ca46e5-975c-4918-af98-7fdb5d238330-tls-certs\") pod \"f0ca46e5-975c-4918-af98-7fdb5d238330\" (UID: \"f0ca46e5-975c-4918-af98-7fdb5d238330\") " Apr 21 16:04:49.181527 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:49.181251 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f0ca46e5-975c-4918-af98-7fdb5d238330-home\") pod \"f0ca46e5-975c-4918-af98-7fdb5d238330\" (UID: \"f0ca46e5-975c-4918-af98-7fdb5d238330\") " Apr 21 16:04:49.181527 
ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:49.181294 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f0ca46e5-975c-4918-af98-7fdb5d238330-dshm\") pod \"f0ca46e5-975c-4918-af98-7fdb5d238330\" (UID: \"f0ca46e5-975c-4918-af98-7fdb5d238330\") " Apr 21 16:04:49.181527 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:49.181406 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0ca46e5-975c-4918-af98-7fdb5d238330-model-cache" (OuterVolumeSpecName: "model-cache") pod "f0ca46e5-975c-4918-af98-7fdb5d238330" (UID: "f0ca46e5-975c-4918-af98-7fdb5d238330"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:04:49.181741 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:49.181598 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f0ca46e5-975c-4918-af98-7fdb5d238330-model-cache\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:04:49.181741 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:49.181641 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0ca46e5-975c-4918-af98-7fdb5d238330-home" (OuterVolumeSpecName: "home") pod "f0ca46e5-975c-4918-af98-7fdb5d238330" (UID: "f0ca46e5-975c-4918-af98-7fdb5d238330"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:04:49.183973 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:49.183944 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0ca46e5-975c-4918-af98-7fdb5d238330-dshm" (OuterVolumeSpecName: "dshm") pod "f0ca46e5-975c-4918-af98-7fdb5d238330" (UID: "f0ca46e5-975c-4918-af98-7fdb5d238330"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:04:49.184115 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:49.184002 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0ca46e5-975c-4918-af98-7fdb5d238330-kube-api-access-qrv7r" (OuterVolumeSpecName: "kube-api-access-qrv7r") pod "f0ca46e5-975c-4918-af98-7fdb5d238330" (UID: "f0ca46e5-975c-4918-af98-7fdb5d238330"). InnerVolumeSpecName "kube-api-access-qrv7r". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:04:49.184220 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:49.184196 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0ca46e5-975c-4918-af98-7fdb5d238330-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f0ca46e5-975c-4918-af98-7fdb5d238330" (UID: "f0ca46e5-975c-4918-af98-7fdb5d238330"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:04:49.244853 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:49.244814 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0ca46e5-975c-4918-af98-7fdb5d238330-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f0ca46e5-975c-4918-af98-7fdb5d238330" (UID: "f0ca46e5-975c-4918-af98-7fdb5d238330"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:04:49.282177 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:49.282137 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f0ca46e5-975c-4918-af98-7fdb5d238330-kserve-provision-location\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:04:49.282177 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:49.282179 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qrv7r\" (UniqueName: \"kubernetes.io/projected/f0ca46e5-975c-4918-af98-7fdb5d238330-kube-api-access-qrv7r\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:04:49.282369 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:49.282197 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f0ca46e5-975c-4918-af98-7fdb5d238330-tls-certs\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:04:49.282369 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:49.282212 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f0ca46e5-975c-4918-af98-7fdb5d238330-home\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:04:49.282369 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:49.282225 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f0ca46e5-975c-4918-af98-7fdb5d238330-dshm\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:04:49.928711 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:49.928675 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" event={"ID":"c5119a6c-0f05-4188-a554-dbb5f0819cfb","Type":"ContainerStarted","Data":"4c623d1b0653e8e5e0a153dd9318ddd004c9966ca64d213b80ddc1690e438f9d"} Apr 21 16:04:49.929145 
ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:49.928718 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" event={"ID":"c5119a6c-0f05-4188-a554-dbb5f0819cfb","Type":"ContainerStarted","Data":"2106221257feb31472cfd8aac728f93441ad9783ed39cf61c90961f0cd175178"} Apr 21 16:04:49.929145 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:49.928822 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" Apr 21 16:04:49.930079 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:49.930056 2569 generic.go:358] "Generic (PLEG): container finished" podID="f0ca46e5-975c-4918-af98-7fdb5d238330" containerID="207c39bb95e05f0ca45c27930a58c5a9b398f7bdff0bcfe9479993ce3b1ca7e5" exitCode=0 Apr 21 16:04:49.930191 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:49.930088 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"f0ca46e5-975c-4918-af98-7fdb5d238330","Type":"ContainerDied","Data":"207c39bb95e05f0ca45c27930a58c5a9b398f7bdff0bcfe9479993ce3b1ca7e5"} Apr 21 16:04:49.930191 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:49.930107 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"f0ca46e5-975c-4918-af98-7fdb5d238330","Type":"ContainerDied","Data":"09463c4539fe02f6dec0e90e8c9b846b045fe786a8c739cce2b30a3f4ca60c79"} Apr 21 16:04:49.930191 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:49.930121 2569 scope.go:117] "RemoveContainer" containerID="207c39bb95e05f0ca45c27930a58c5a9b398f7bdff0bcfe9479993ce3b1ca7e5" Apr 21 16:04:49.930191 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:49.930146 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 21 16:04:49.952521 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:49.952467 2569 scope.go:117] "RemoveContainer" containerID="8e57c166d97071d14b2fd8af32fc4f232dfd1c45ded2c71a98ae927dceac39eb" Apr 21 16:04:49.954762 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:49.954710 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" podStartSLOduration=3.954693453 podStartE2EDuration="3.954693453s" podCreationTimestamp="2026-04-21 16:04:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:04:49.951908443 +0000 UTC m=+1765.523941388" watchObservedRunningTime="2026-04-21 16:04:49.954693453 +0000 UTC m=+1765.526726399" Apr 21 16:04:49.967710 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:49.967632 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 21 16:04:49.971768 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:49.971742 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 21 16:04:50.011749 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:50.011725 2569 scope.go:117] "RemoveContainer" containerID="207c39bb95e05f0ca45c27930a58c5a9b398f7bdff0bcfe9479993ce3b1ca7e5" Apr 21 16:04:50.012051 ip-10-0-128-232 kubenswrapper[2569]: E0421 16:04:50.012031 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"207c39bb95e05f0ca45c27930a58c5a9b398f7bdff0bcfe9479993ce3b1ca7e5\": container with ID starting with 207c39bb95e05f0ca45c27930a58c5a9b398f7bdff0bcfe9479993ce3b1ca7e5 not found: ID does not exist" 
containerID="207c39bb95e05f0ca45c27930a58c5a9b398f7bdff0bcfe9479993ce3b1ca7e5" Apr 21 16:04:50.012101 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:50.012063 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"207c39bb95e05f0ca45c27930a58c5a9b398f7bdff0bcfe9479993ce3b1ca7e5"} err="failed to get container status \"207c39bb95e05f0ca45c27930a58c5a9b398f7bdff0bcfe9479993ce3b1ca7e5\": rpc error: code = NotFound desc = could not find container \"207c39bb95e05f0ca45c27930a58c5a9b398f7bdff0bcfe9479993ce3b1ca7e5\": container with ID starting with 207c39bb95e05f0ca45c27930a58c5a9b398f7bdff0bcfe9479993ce3b1ca7e5 not found: ID does not exist" Apr 21 16:04:50.012101 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:50.012082 2569 scope.go:117] "RemoveContainer" containerID="8e57c166d97071d14b2fd8af32fc4f232dfd1c45ded2c71a98ae927dceac39eb" Apr 21 16:04:50.012386 ip-10-0-128-232 kubenswrapper[2569]: E0421 16:04:50.012369 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e57c166d97071d14b2fd8af32fc4f232dfd1c45ded2c71a98ae927dceac39eb\": container with ID starting with 8e57c166d97071d14b2fd8af32fc4f232dfd1c45ded2c71a98ae927dceac39eb not found: ID does not exist" containerID="8e57c166d97071d14b2fd8af32fc4f232dfd1c45ded2c71a98ae927dceac39eb" Apr 21 16:04:50.012426 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:50.012390 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e57c166d97071d14b2fd8af32fc4f232dfd1c45ded2c71a98ae927dceac39eb"} err="failed to get container status \"8e57c166d97071d14b2fd8af32fc4f232dfd1c45ded2c71a98ae927dceac39eb\": rpc error: code = NotFound desc = could not find container \"8e57c166d97071d14b2fd8af32fc4f232dfd1c45ded2c71a98ae927dceac39eb\": container with ID starting with 8e57c166d97071d14b2fd8af32fc4f232dfd1c45ded2c71a98ae927dceac39eb not found: ID does not exist" Apr 21 
16:04:51.054609 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:51.054575 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0ca46e5-975c-4918-af98-7fdb5d238330" path="/var/lib/kubelet/pods/f0ca46e5-975c-4918-af98-7fdb5d238330/volumes" Apr 21 16:04:57.211544 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:57.211506 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" Apr 21 16:04:57.212000 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:57.211563 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" Apr 21 16:04:57.214188 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:57.214162 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" Apr 21 16:04:57.966350 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:04:57.966317 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" Apr 21 16:05:05.130992 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:05.130952 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" podUID="c7f5d183-76fb-4e4a-aed3-cdbb755398c5" containerName="llm-d-routing-sidecar" containerID="cri-o://98836aef681f4138c8411e700f697879f65d2389b4680d9687ed2906546c5153" gracePeriod=2 Apr 21 16:05:05.373741 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:05.373708 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk_c7f5d183-76fb-4e4a-aed3-cdbb755398c5/main/0.log" Apr 21 16:05:05.374372 ip-10-0-128-232 kubenswrapper[2569]: 
I0421 16:05:05.374351 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" Apr 21 16:05:05.428565 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:05.428531 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrj6j\" (UniqueName: \"kubernetes.io/projected/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-kube-api-access-vrj6j\") pod \"c7f5d183-76fb-4e4a-aed3-cdbb755398c5\" (UID: \"c7f5d183-76fb-4e4a-aed3-cdbb755398c5\") " Apr 21 16:05:05.428737 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:05.428589 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-dshm\") pod \"c7f5d183-76fb-4e4a-aed3-cdbb755398c5\" (UID: \"c7f5d183-76fb-4e4a-aed3-cdbb755398c5\") " Apr 21 16:05:05.428783 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:05.428732 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-home\") pod \"c7f5d183-76fb-4e4a-aed3-cdbb755398c5\" (UID: \"c7f5d183-76fb-4e4a-aed3-cdbb755398c5\") " Apr 21 16:05:05.428783 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:05.428764 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-kserve-provision-location\") pod \"c7f5d183-76fb-4e4a-aed3-cdbb755398c5\" (UID: \"c7f5d183-76fb-4e4a-aed3-cdbb755398c5\") " Apr 21 16:05:05.428886 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:05.428786 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-tls-certs\") pod \"c7f5d183-76fb-4e4a-aed3-cdbb755398c5\" (UID: 
\"c7f5d183-76fb-4e4a-aed3-cdbb755398c5\") " Apr 21 16:05:05.428886 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:05.428829 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-model-cache\") pod \"c7f5d183-76fb-4e4a-aed3-cdbb755398c5\" (UID: \"c7f5d183-76fb-4e4a-aed3-cdbb755398c5\") " Apr 21 16:05:05.429246 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:05.429198 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-model-cache" (OuterVolumeSpecName: "model-cache") pod "c7f5d183-76fb-4e4a-aed3-cdbb755398c5" (UID: "c7f5d183-76fb-4e4a-aed3-cdbb755398c5"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:05:05.429246 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:05.429222 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-home" (OuterVolumeSpecName: "home") pod "c7f5d183-76fb-4e4a-aed3-cdbb755398c5" (UID: "c7f5d183-76fb-4e4a-aed3-cdbb755398c5"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:05:05.430839 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:05.430814 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-dshm" (OuterVolumeSpecName: "dshm") pod "c7f5d183-76fb-4e4a-aed3-cdbb755398c5" (UID: "c7f5d183-76fb-4e4a-aed3-cdbb755398c5"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:05:05.430958 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:05.430850 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-kube-api-access-vrj6j" (OuterVolumeSpecName: "kube-api-access-vrj6j") pod "c7f5d183-76fb-4e4a-aed3-cdbb755398c5" (UID: "c7f5d183-76fb-4e4a-aed3-cdbb755398c5"). InnerVolumeSpecName "kube-api-access-vrj6j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:05:05.430958 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:05.430935 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c7f5d183-76fb-4e4a-aed3-cdbb755398c5" (UID: "c7f5d183-76fb-4e4a-aed3-cdbb755398c5"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:05:05.479574 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:05.479515 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c7f5d183-76fb-4e4a-aed3-cdbb755398c5" (UID: "c7f5d183-76fb-4e4a-aed3-cdbb755398c5"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:05:05.530525 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:05.530448 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vrj6j\" (UniqueName: \"kubernetes.io/projected/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-kube-api-access-vrj6j\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:05:05.530525 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:05.530525 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-dshm\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:05:05.530525 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:05.530541 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-home\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:05:05.530777 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:05.530551 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-kserve-provision-location\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:05:05.530777 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:05.530561 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-tls-certs\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:05:05.530777 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:05.530571 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c7f5d183-76fb-4e4a-aed3-cdbb755398c5-model-cache\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:05:06.001356 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:06.001326 2569 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk_c7f5d183-76fb-4e4a-aed3-cdbb755398c5/main/0.log" Apr 21 16:05:06.002028 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:06.001998 2569 generic.go:358] "Generic (PLEG): container finished" podID="c7f5d183-76fb-4e4a-aed3-cdbb755398c5" containerID="b1203070dbb59e8702d690c36cf6d021935cb587e32d3891635ed18811ea86ba" exitCode=137 Apr 21 16:05:06.002104 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:06.002029 2569 generic.go:358] "Generic (PLEG): container finished" podID="c7f5d183-76fb-4e4a-aed3-cdbb755398c5" containerID="98836aef681f4138c8411e700f697879f65d2389b4680d9687ed2906546c5153" exitCode=0 Apr 21 16:05:06.002104 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:06.002075 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" event={"ID":"c7f5d183-76fb-4e4a-aed3-cdbb755398c5","Type":"ContainerDied","Data":"b1203070dbb59e8702d690c36cf6d021935cb587e32d3891635ed18811ea86ba"} Apr 21 16:05:06.002182 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:06.002114 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" Apr 21 16:05:06.002182 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:06.002129 2569 scope.go:117] "RemoveContainer" containerID="b1203070dbb59e8702d690c36cf6d021935cb587e32d3891635ed18811ea86ba" Apr 21 16:05:06.002286 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:06.002117 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" event={"ID":"c7f5d183-76fb-4e4a-aed3-cdbb755398c5","Type":"ContainerDied","Data":"98836aef681f4138c8411e700f697879f65d2389b4680d9687ed2906546c5153"} Apr 21 16:05:06.002286 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:06.002278 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk" event={"ID":"c7f5d183-76fb-4e4a-aed3-cdbb755398c5","Type":"ContainerDied","Data":"2da6750226ea400f76e62e81deec5f4b04a1305acb0c0c2e1bc1286d34bed750"} Apr 21 16:05:06.024398 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:06.024373 2569 scope.go:117] "RemoveContainer" containerID="cb002052073724a8f94a335997847b206eb266de5211895a04b8a62c7f1189f9" Apr 21 16:05:06.026212 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:06.026184 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk"] Apr 21 16:05:06.029652 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:06.029628 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-75f5bfbf85-nz7gk"] Apr 21 16:05:06.084469 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:06.084448 2569 scope.go:117] "RemoveContainer" containerID="98836aef681f4138c8411e700f697879f65d2389b4680d9687ed2906546c5153" Apr 21 16:05:06.092561 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:06.092541 2569 scope.go:117] "RemoveContainer" 
containerID="b1203070dbb59e8702d690c36cf6d021935cb587e32d3891635ed18811ea86ba" Apr 21 16:05:06.092811 ip-10-0-128-232 kubenswrapper[2569]: E0421 16:05:06.092792 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1203070dbb59e8702d690c36cf6d021935cb587e32d3891635ed18811ea86ba\": container with ID starting with b1203070dbb59e8702d690c36cf6d021935cb587e32d3891635ed18811ea86ba not found: ID does not exist" containerID="b1203070dbb59e8702d690c36cf6d021935cb587e32d3891635ed18811ea86ba" Apr 21 16:05:06.092877 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:06.092821 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1203070dbb59e8702d690c36cf6d021935cb587e32d3891635ed18811ea86ba"} err="failed to get container status \"b1203070dbb59e8702d690c36cf6d021935cb587e32d3891635ed18811ea86ba\": rpc error: code = NotFound desc = could not find container \"b1203070dbb59e8702d690c36cf6d021935cb587e32d3891635ed18811ea86ba\": container with ID starting with b1203070dbb59e8702d690c36cf6d021935cb587e32d3891635ed18811ea86ba not found: ID does not exist" Apr 21 16:05:06.092877 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:06.092841 2569 scope.go:117] "RemoveContainer" containerID="cb002052073724a8f94a335997847b206eb266de5211895a04b8a62c7f1189f9" Apr 21 16:05:06.093091 ip-10-0-128-232 kubenswrapper[2569]: E0421 16:05:06.093060 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb002052073724a8f94a335997847b206eb266de5211895a04b8a62c7f1189f9\": container with ID starting with cb002052073724a8f94a335997847b206eb266de5211895a04b8a62c7f1189f9 not found: ID does not exist" containerID="cb002052073724a8f94a335997847b206eb266de5211895a04b8a62c7f1189f9" Apr 21 16:05:06.093162 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:06.093103 2569 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"cb002052073724a8f94a335997847b206eb266de5211895a04b8a62c7f1189f9"} err="failed to get container status \"cb002052073724a8f94a335997847b206eb266de5211895a04b8a62c7f1189f9\": rpc error: code = NotFound desc = could not find container \"cb002052073724a8f94a335997847b206eb266de5211895a04b8a62c7f1189f9\": container with ID starting with cb002052073724a8f94a335997847b206eb266de5211895a04b8a62c7f1189f9 not found: ID does not exist" Apr 21 16:05:06.093162 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:06.093130 2569 scope.go:117] "RemoveContainer" containerID="98836aef681f4138c8411e700f697879f65d2389b4680d9687ed2906546c5153" Apr 21 16:05:06.093413 ip-10-0-128-232 kubenswrapper[2569]: E0421 16:05:06.093398 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98836aef681f4138c8411e700f697879f65d2389b4680d9687ed2906546c5153\": container with ID starting with 98836aef681f4138c8411e700f697879f65d2389b4680d9687ed2906546c5153 not found: ID does not exist" containerID="98836aef681f4138c8411e700f697879f65d2389b4680d9687ed2906546c5153" Apr 21 16:05:06.093470 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:06.093419 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98836aef681f4138c8411e700f697879f65d2389b4680d9687ed2906546c5153"} err="failed to get container status \"98836aef681f4138c8411e700f697879f65d2389b4680d9687ed2906546c5153\": rpc error: code = NotFound desc = could not find container \"98836aef681f4138c8411e700f697879f65d2389b4680d9687ed2906546c5153\": container with ID starting with 98836aef681f4138c8411e700f697879f65d2389b4680d9687ed2906546c5153 not found: ID does not exist" Apr 21 16:05:06.093470 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:06.093437 2569 scope.go:117] "RemoveContainer" containerID="b1203070dbb59e8702d690c36cf6d021935cb587e32d3891635ed18811ea86ba" Apr 21 16:05:06.093678 
ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:06.093658 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1203070dbb59e8702d690c36cf6d021935cb587e32d3891635ed18811ea86ba"} err="failed to get container status \"b1203070dbb59e8702d690c36cf6d021935cb587e32d3891635ed18811ea86ba\": rpc error: code = NotFound desc = could not find container \"b1203070dbb59e8702d690c36cf6d021935cb587e32d3891635ed18811ea86ba\": container with ID starting with b1203070dbb59e8702d690c36cf6d021935cb587e32d3891635ed18811ea86ba not found: ID does not exist" Apr 21 16:05:06.093721 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:06.093679 2569 scope.go:117] "RemoveContainer" containerID="cb002052073724a8f94a335997847b206eb266de5211895a04b8a62c7f1189f9" Apr 21 16:05:06.093903 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:06.093882 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb002052073724a8f94a335997847b206eb266de5211895a04b8a62c7f1189f9"} err="failed to get container status \"cb002052073724a8f94a335997847b206eb266de5211895a04b8a62c7f1189f9\": rpc error: code = NotFound desc = could not find container \"cb002052073724a8f94a335997847b206eb266de5211895a04b8a62c7f1189f9\": container with ID starting with cb002052073724a8f94a335997847b206eb266de5211895a04b8a62c7f1189f9 not found: ID does not exist" Apr 21 16:05:06.093948 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:06.093906 2569 scope.go:117] "RemoveContainer" containerID="98836aef681f4138c8411e700f697879f65d2389b4680d9687ed2906546c5153" Apr 21 16:05:06.094096 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:06.094080 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98836aef681f4138c8411e700f697879f65d2389b4680d9687ed2906546c5153"} err="failed to get container status \"98836aef681f4138c8411e700f697879f65d2389b4680d9687ed2906546c5153\": rpc error: code = NotFound desc = could not 
find container \"98836aef681f4138c8411e700f697879f65d2389b4680d9687ed2906546c5153\": container with ID starting with 98836aef681f4138c8411e700f697879f65d2389b4680d9687ed2906546c5153 not found: ID does not exist" Apr 21 16:05:07.054145 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:07.054102 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7f5d183-76fb-4e4a-aed3-cdbb755398c5" path="/var/lib/kubelet/pods/c7f5d183-76fb-4e4a-aed3-cdbb755398c5/volumes" Apr 21 16:05:25.090127 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:25.090099 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wkxqg_7b0083d8-b152-40fa-9a89-e3180ed1747d/ovn-acl-logging/0.log" Apr 21 16:05:25.092929 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:25.092898 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wkxqg_7b0083d8-b152-40fa-9a89-e3180ed1747d/ovn-acl-logging/0.log" Apr 21 16:05:28.970164 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:28.970132 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" Apr 21 16:05:30.344241 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:30.344212 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5"] Apr 21 16:05:30.344696 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:30.344493 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" podUID="c5119a6c-0f05-4188-a554-dbb5f0819cfb" containerName="main" containerID="cri-o://2106221257feb31472cfd8aac728f93441ad9783ed39cf61c90961f0cd175178" gracePeriod=30 Apr 21 16:05:30.344696 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:30.344552 2569 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" podUID="c5119a6c-0f05-4188-a554-dbb5f0819cfb" containerName="tokenizer" containerID="cri-o://4c623d1b0653e8e5e0a153dd9318ddd004c9966ca64d213b80ddc1690e438f9d" gracePeriod=30 Apr 21 16:05:31.095734 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:31.095698 2569 generic.go:358] "Generic (PLEG): container finished" podID="c5119a6c-0f05-4188-a554-dbb5f0819cfb" containerID="2106221257feb31472cfd8aac728f93441ad9783ed39cf61c90961f0cd175178" exitCode=0 Apr 21 16:05:31.095916 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:31.095761 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" event={"ID":"c5119a6c-0f05-4188-a554-dbb5f0819cfb","Type":"ContainerDied","Data":"2106221257feb31472cfd8aac728f93441ad9783ed39cf61c90961f0cd175178"} Apr 21 16:05:31.603588 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:31.603559 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" Apr 21 16:05:31.747677 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:31.747641 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c5119a6c-0f05-4188-a554-dbb5f0819cfb-tokenizer-cache\") pod \"c5119a6c-0f05-4188-a554-dbb5f0819cfb\" (UID: \"c5119a6c-0f05-4188-a554-dbb5f0819cfb\") " Apr 21 16:05:31.747677 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:31.747680 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c5119a6c-0f05-4188-a554-dbb5f0819cfb-tls-certs\") pod \"c5119a6c-0f05-4188-a554-dbb5f0819cfb\" (UID: \"c5119a6c-0f05-4188-a554-dbb5f0819cfb\") " Apr 21 16:05:31.747899 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:31.747708 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c5119a6c-0f05-4188-a554-dbb5f0819cfb-tokenizer-uds\") pod \"c5119a6c-0f05-4188-a554-dbb5f0819cfb\" (UID: \"c5119a6c-0f05-4188-a554-dbb5f0819cfb\") " Apr 21 16:05:31.747899 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:31.747738 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5119a6c-0f05-4188-a554-dbb5f0819cfb-kserve-provision-location\") pod \"c5119a6c-0f05-4188-a554-dbb5f0819cfb\" (UID: \"c5119a6c-0f05-4188-a554-dbb5f0819cfb\") " Apr 21 16:05:31.747899 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:31.747852 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klbck\" (UniqueName: \"kubernetes.io/projected/c5119a6c-0f05-4188-a554-dbb5f0819cfb-kube-api-access-klbck\") pod \"c5119a6c-0f05-4188-a554-dbb5f0819cfb\" (UID: \"c5119a6c-0f05-4188-a554-dbb5f0819cfb\") 
" Apr 21 16:05:31.747899 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:31.747893 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c5119a6c-0f05-4188-a554-dbb5f0819cfb-tokenizer-tmp\") pod \"c5119a6c-0f05-4188-a554-dbb5f0819cfb\" (UID: \"c5119a6c-0f05-4188-a554-dbb5f0819cfb\") " Apr 21 16:05:31.748087 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:31.747983 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5119a6c-0f05-4188-a554-dbb5f0819cfb-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "c5119a6c-0f05-4188-a554-dbb5f0819cfb" (UID: "c5119a6c-0f05-4188-a554-dbb5f0819cfb"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:05:31.748087 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:31.747989 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5119a6c-0f05-4188-a554-dbb5f0819cfb-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "c5119a6c-0f05-4188-a554-dbb5f0819cfb" (UID: "c5119a6c-0f05-4188-a554-dbb5f0819cfb"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:05:31.748225 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:31.748206 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c5119a6c-0f05-4188-a554-dbb5f0819cfb-tokenizer-cache\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:05:31.748267 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:31.748230 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c5119a6c-0f05-4188-a554-dbb5f0819cfb-tokenizer-uds\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:05:31.748301 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:31.748264 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5119a6c-0f05-4188-a554-dbb5f0819cfb-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "c5119a6c-0f05-4188-a554-dbb5f0819cfb" (UID: "c5119a6c-0f05-4188-a554-dbb5f0819cfb"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:05:31.748467 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:31.748449 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5119a6c-0f05-4188-a554-dbb5f0819cfb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c5119a6c-0f05-4188-a554-dbb5f0819cfb" (UID: "c5119a6c-0f05-4188-a554-dbb5f0819cfb"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:05:31.749890 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:31.749859 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5119a6c-0f05-4188-a554-dbb5f0819cfb-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c5119a6c-0f05-4188-a554-dbb5f0819cfb" (UID: "c5119a6c-0f05-4188-a554-dbb5f0819cfb"). 
InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:05:31.749890 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:31.749867 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5119a6c-0f05-4188-a554-dbb5f0819cfb-kube-api-access-klbck" (OuterVolumeSpecName: "kube-api-access-klbck") pod "c5119a6c-0f05-4188-a554-dbb5f0819cfb" (UID: "c5119a6c-0f05-4188-a554-dbb5f0819cfb"). InnerVolumeSpecName "kube-api-access-klbck". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:05:31.849267 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:31.849222 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5119a6c-0f05-4188-a554-dbb5f0819cfb-kserve-provision-location\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:05:31.849267 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:31.849262 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-klbck\" (UniqueName: \"kubernetes.io/projected/c5119a6c-0f05-4188-a554-dbb5f0819cfb-kube-api-access-klbck\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:05:31.849267 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:31.849274 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c5119a6c-0f05-4188-a554-dbb5f0819cfb-tokenizer-tmp\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:05:31.849267 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:31.849283 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c5119a6c-0f05-4188-a554-dbb5f0819cfb-tls-certs\") on node \"ip-10-0-128-232.ec2.internal\" DevicePath \"\"" Apr 21 16:05:32.101930 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:32.101848 2569 generic.go:358] "Generic (PLEG): container finished" 
podID="c5119a6c-0f05-4188-a554-dbb5f0819cfb" containerID="4c623d1b0653e8e5e0a153dd9318ddd004c9966ca64d213b80ddc1690e438f9d" exitCode=0 Apr 21 16:05:32.101930 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:32.101887 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" event={"ID":"c5119a6c-0f05-4188-a554-dbb5f0819cfb","Type":"ContainerDied","Data":"4c623d1b0653e8e5e0a153dd9318ddd004c9966ca64d213b80ddc1690e438f9d"} Apr 21 16:05:32.101930 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:32.101916 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" event={"ID":"c5119a6c-0f05-4188-a554-dbb5f0819cfb","Type":"ContainerDied","Data":"2f373da8e6400f95483f9485801d11b5f471587f2341bf966644b35b70926f9c"} Apr 21 16:05:32.101930 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:32.101922 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5" Apr 21 16:05:32.102206 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:32.101935 2569 scope.go:117] "RemoveContainer" containerID="4c623d1b0653e8e5e0a153dd9318ddd004c9966ca64d213b80ddc1690e438f9d" Apr 21 16:05:32.111347 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:32.111329 2569 scope.go:117] "RemoveContainer" containerID="2106221257feb31472cfd8aac728f93441ad9783ed39cf61c90961f0cd175178" Apr 21 16:05:32.118784 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:32.118763 2569 scope.go:117] "RemoveContainer" containerID="164e85c09b56e61ecbb44abbc451662add4d31dfbc6fab552ce19fd2d1cd62e5" Apr 21 16:05:32.124915 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:32.124892 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5"] Apr 21 16:05:32.127264 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:32.127240 2569 scope.go:117] "RemoveContainer" containerID="4c623d1b0653e8e5e0a153dd9318ddd004c9966ca64d213b80ddc1690e438f9d" Apr 21 16:05:32.127599 ip-10-0-128-232 kubenswrapper[2569]: E0421 16:05:32.127562 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c623d1b0653e8e5e0a153dd9318ddd004c9966ca64d213b80ddc1690e438f9d\": container with ID starting with 4c623d1b0653e8e5e0a153dd9318ddd004c9966ca64d213b80ddc1690e438f9d not found: ID does not exist" containerID="4c623d1b0653e8e5e0a153dd9318ddd004c9966ca64d213b80ddc1690e438f9d" Apr 21 16:05:32.127709 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:32.127604 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c623d1b0653e8e5e0a153dd9318ddd004c9966ca64d213b80ddc1690e438f9d"} err="failed to get container status \"4c623d1b0653e8e5e0a153dd9318ddd004c9966ca64d213b80ddc1690e438f9d\": rpc error: code = NotFound 
desc = could not find container \"4c623d1b0653e8e5e0a153dd9318ddd004c9966ca64d213b80ddc1690e438f9d\": container with ID starting with 4c623d1b0653e8e5e0a153dd9318ddd004c9966ca64d213b80ddc1690e438f9d not found: ID does not exist" Apr 21 16:05:32.127709 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:32.127629 2569 scope.go:117] "RemoveContainer" containerID="2106221257feb31472cfd8aac728f93441ad9783ed39cf61c90961f0cd175178" Apr 21 16:05:32.127901 ip-10-0-128-232 kubenswrapper[2569]: E0421 16:05:32.127882 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2106221257feb31472cfd8aac728f93441ad9783ed39cf61c90961f0cd175178\": container with ID starting with 2106221257feb31472cfd8aac728f93441ad9783ed39cf61c90961f0cd175178 not found: ID does not exist" containerID="2106221257feb31472cfd8aac728f93441ad9783ed39cf61c90961f0cd175178" Apr 21 16:05:32.127942 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:32.127907 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2106221257feb31472cfd8aac728f93441ad9783ed39cf61c90961f0cd175178"} err="failed to get container status \"2106221257feb31472cfd8aac728f93441ad9783ed39cf61c90961f0cd175178\": rpc error: code = NotFound desc = could not find container \"2106221257feb31472cfd8aac728f93441ad9783ed39cf61c90961f0cd175178\": container with ID starting with 2106221257feb31472cfd8aac728f93441ad9783ed39cf61c90961f0cd175178 not found: ID does not exist" Apr 21 16:05:32.127942 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:32.127924 2569 scope.go:117] "RemoveContainer" containerID="164e85c09b56e61ecbb44abbc451662add4d31dfbc6fab552ce19fd2d1cd62e5" Apr 21 16:05:32.128176 ip-10-0-128-232 kubenswrapper[2569]: E0421 16:05:32.128158 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"164e85c09b56e61ecbb44abbc451662add4d31dfbc6fab552ce19fd2d1cd62e5\": 
container with ID starting with 164e85c09b56e61ecbb44abbc451662add4d31dfbc6fab552ce19fd2d1cd62e5 not found: ID does not exist" containerID="164e85c09b56e61ecbb44abbc451662add4d31dfbc6fab552ce19fd2d1cd62e5" Apr 21 16:05:32.128233 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:32.128186 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"164e85c09b56e61ecbb44abbc451662add4d31dfbc6fab552ce19fd2d1cd62e5"} err="failed to get container status \"164e85c09b56e61ecbb44abbc451662add4d31dfbc6fab552ce19fd2d1cd62e5\": rpc error: code = NotFound desc = could not find container \"164e85c09b56e61ecbb44abbc451662add4d31dfbc6fab552ce19fd2d1cd62e5\": container with ID starting with 164e85c09b56e61ecbb44abbc451662add4d31dfbc6fab552ce19fd2d1cd62e5 not found: ID does not exist" Apr 21 16:05:32.128993 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:32.128976 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-674c9m7sh5"] Apr 21 16:05:33.055208 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:33.055172 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5119a6c-0f05-4188-a554-dbb5f0819cfb" path="/var/lib/kubelet/pods/c5119a6c-0f05-4188-a554-dbb5f0819cfb/volumes" Apr 21 16:05:58.428776 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:58.428748 2569 ???:1] "http: TLS handshake error from 10.0.128.232:55602: EOF" Apr 21 16:05:58.429914 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:58.429889 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-744b6c97cd-ktnzw_e601bc85-e6ec-4e42-8728-90ec8be4699c/router/0.log" Apr 21 16:05:59.293817 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:05:59.293784 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-744b6c97cd-ktnzw_e601bc85-e6ec-4e42-8728-90ec8be4699c/router/0.log" Apr 21 16:06:00.107812 ip-10-0-128-232 
kubenswrapper[2569]: I0421 16:06:00.107779 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-4snmx_f2e1d285-ef28-46cc-bdca-484abff6dcb4/authorino/0.log" Apr 21 16:06:00.148401 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:00.148370 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-nx62s_3060312c-c108-46b7-b499-e2fe4957e773/kuadrant-console-plugin/0.log" Apr 21 16:06:05.639849 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:05.639820 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-4mz5z_324dbbf6-3ab0-424a-a580-24aa71c13cb6/global-pull-secret-syncer/0.log" Apr 21 16:06:05.737262 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:05.737229 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-6l9p8_822fd92b-9cc2-44e7-972a-9b68cde8ab9a/konnectivity-agent/0.log" Apr 21 16:06:05.823924 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:05.823894 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-232.ec2.internal_fca69f1fb3124857e02406d9db421c8d/haproxy/0.log" Apr 21 16:06:09.687217 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:09.687189 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-4snmx_f2e1d285-ef28-46cc-bdca-484abff6dcb4/authorino/0.log" Apr 21 16:06:09.782610 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:09.782581 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-nx62s_3060312c-c108-46b7-b499-e2fe4957e773/kuadrant-console-plugin/0.log" Apr 21 16:06:11.493493 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:11.493437 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s64zs_cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c/node-exporter/0.log" Apr 
Apr 21 16:06:11.517090 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:11.517062 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s64zs_cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c/kube-rbac-proxy/0.log"
Apr 21 16:06:11.544592 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:11.544566 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s64zs_cac7a4e2-e818-4cbc-9a4d-c5bcd0f4340c/init-textfile/0.log"
Apr 21 16:06:11.649837 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:11.649805 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-s9g9g_f42786e0-4a65-43c9-a735-126e01bd577a/kube-rbac-proxy-main/0.log"
Apr 21 16:06:11.674040 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:11.674009 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-s9g9g_f42786e0-4a65-43c9-a735-126e01bd577a/kube-rbac-proxy-self/0.log"
Apr 21 16:06:11.720342 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:11.720305 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-s9g9g_f42786e0-4a65-43c9-a735-126e01bd577a/openshift-state-metrics/0.log"
Apr 21 16:06:11.956365 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:11.956338 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-rmh8w_110cb247-6c78-4666-aba6-4a6ac658c728/prometheus-operator/0.log"
Apr 21 16:06:11.976023 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:11.975994 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-rmh8w_110cb247-6c78-4666-aba6-4a6ac658c728/kube-rbac-proxy/0.log"
Apr 21 16:06:11.998700 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:11.998672 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-522f8_1e991f09-4d79-4d2a-9195-adc43e8fcfc4/prometheus-operator-admission-webhook/0.log"
Apr 21 16:06:13.336793 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:13.336762 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-qvkhd_54dec317-f338-45be-84df-5116ae87636c/networking-console-plugin/0.log"
Apr 21 16:06:14.342772 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.342744 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-fb57bc5dc-rdphb_ad438104-e8e8-4a3d-b4b1-086c7fd193a6/console/0.log"
Apr 21 16:06:14.377153 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.377121 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-7vrxq_17c163e7-de47-4efa-bea9-78d232508160/download-server/0.log"
Apr 21 16:06:14.553580 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.553550 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dt7xd/perf-node-gather-daemonset-btj94"]
Apr 21 16:06:14.553884 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.553872 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5119a6c-0f05-4188-a554-dbb5f0819cfb" containerName="main"
Apr 21 16:06:14.553940 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.553886 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5119a6c-0f05-4188-a554-dbb5f0819cfb" containerName="main"
Apr 21 16:06:14.553940 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.553895 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7f5d183-76fb-4e4a-aed3-cdbb755398c5" containerName="storage-initializer"
Apr 21 16:06:14.553940 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.553901 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7f5d183-76fb-4e4a-aed3-cdbb755398c5" containerName="storage-initializer"
Apr 21 16:06:14.553940 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.553915 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f0ca46e5-975c-4918-af98-7fdb5d238330" containerName="main"
Apr 21 16:06:14.553940 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.553920 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ca46e5-975c-4918-af98-7fdb5d238330" containerName="main"
Apr 21 16:06:14.553940 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.553928 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5119a6c-0f05-4188-a554-dbb5f0819cfb" containerName="tokenizer"
Apr 21 16:06:14.553940 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.553933 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5119a6c-0f05-4188-a554-dbb5f0819cfb" containerName="tokenizer"
Apr 21 16:06:14.554249 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.553942 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7f5d183-76fb-4e4a-aed3-cdbb755398c5" containerName="llm-d-routing-sidecar"
Apr 21 16:06:14.554249 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.553948 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7f5d183-76fb-4e4a-aed3-cdbb755398c5" containerName="llm-d-routing-sidecar"
Apr 21 16:06:14.554249 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.553954 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7f5d183-76fb-4e4a-aed3-cdbb755398c5" containerName="main"
Apr 21 16:06:14.554249 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.553961 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7f5d183-76fb-4e4a-aed3-cdbb755398c5" containerName="main"
Apr 21 16:06:14.554249 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.553966 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f0ca46e5-975c-4918-af98-7fdb5d238330" containerName="storage-initializer"
Apr 21 16:06:14.554249 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.553972 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ca46e5-975c-4918-af98-7fdb5d238330" containerName="storage-initializer"
Apr 21 16:06:14.554249 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.553978 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5119a6c-0f05-4188-a554-dbb5f0819cfb" containerName="storage-initializer"
Apr 21 16:06:14.554249 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.553983 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5119a6c-0f05-4188-a554-dbb5f0819cfb" containerName="storage-initializer"
Apr 21 16:06:14.554249 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.554033 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f0ca46e5-975c-4918-af98-7fdb5d238330" containerName="main"
Apr 21 16:06:14.554249 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.554041 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7f5d183-76fb-4e4a-aed3-cdbb755398c5" containerName="main"
Apr 21 16:06:14.554249 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.554048 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7f5d183-76fb-4e4a-aed3-cdbb755398c5" containerName="llm-d-routing-sidecar"
Apr 21 16:06:14.554249 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.554053 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="c5119a6c-0f05-4188-a554-dbb5f0819cfb" containerName="main"
Apr 21 16:06:14.554249 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.554061 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="c5119a6c-0f05-4188-a554-dbb5f0819cfb" containerName="tokenizer"
Apr 21 16:06:14.557136 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.557116 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-btj94"
Apr 21 16:06:14.561256 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.561218 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-dt7xd\"/\"default-dockercfg-k8dzq\""
Apr 21 16:06:14.561424 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.561408 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-dt7xd\"/\"openshift-service-ca.crt\""
Apr 21 16:06:14.562170 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.562155 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-dt7xd\"/\"kube-root-ca.crt\""
Apr 21 16:06:14.569115 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.569091 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dt7xd/perf-node-gather-daemonset-btj94"]
Apr 21 16:06:14.605379 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.605296 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/52f12a11-3f8c-42c6-9305-7d2fee012bae-podres\") pod \"perf-node-gather-daemonset-btj94\" (UID: \"52f12a11-3f8c-42c6-9305-7d2fee012bae\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-btj94"
Apr 21 16:06:14.605379 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.605338 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/52f12a11-3f8c-42c6-9305-7d2fee012bae-lib-modules\") pod \"perf-node-gather-daemonset-btj94\" (UID: \"52f12a11-3f8c-42c6-9305-7d2fee012bae\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-btj94"
Apr 21 16:06:14.605590 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.605412 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/52f12a11-3f8c-42c6-9305-7d2fee012bae-sys\") pod \"perf-node-gather-daemonset-btj94\" (UID: \"52f12a11-3f8c-42c6-9305-7d2fee012bae\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-btj94"
Apr 21 16:06:14.605590 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.605460 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/52f12a11-3f8c-42c6-9305-7d2fee012bae-proc\") pod \"perf-node-gather-daemonset-btj94\" (UID: \"52f12a11-3f8c-42c6-9305-7d2fee012bae\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-btj94"
Apr 21 16:06:14.605590 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.605518 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c847s\" (UniqueName: \"kubernetes.io/projected/52f12a11-3f8c-42c6-9305-7d2fee012bae-kube-api-access-c847s\") pod \"perf-node-gather-daemonset-btj94\" (UID: \"52f12a11-3f8c-42c6-9305-7d2fee012bae\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-btj94"
Apr 21 16:06:14.706976 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.706928 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/52f12a11-3f8c-42c6-9305-7d2fee012bae-sys\") pod \"perf-node-gather-daemonset-btj94\" (UID: \"52f12a11-3f8c-42c6-9305-7d2fee012bae\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-btj94"
Apr 21 16:06:14.707240 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.706985 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/52f12a11-3f8c-42c6-9305-7d2fee012bae-proc\") pod \"perf-node-gather-daemonset-btj94\" (UID: \"52f12a11-3f8c-42c6-9305-7d2fee012bae\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-btj94"
Apr 21 16:06:14.707240 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.707026 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c847s\" (UniqueName: \"kubernetes.io/projected/52f12a11-3f8c-42c6-9305-7d2fee012bae-kube-api-access-c847s\") pod \"perf-node-gather-daemonset-btj94\" (UID: \"52f12a11-3f8c-42c6-9305-7d2fee012bae\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-btj94"
Apr 21 16:06:14.707240 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.707066 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/52f12a11-3f8c-42c6-9305-7d2fee012bae-sys\") pod \"perf-node-gather-daemonset-btj94\" (UID: \"52f12a11-3f8c-42c6-9305-7d2fee012bae\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-btj94"
Apr 21 16:06:14.707240 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.707080 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/52f12a11-3f8c-42c6-9305-7d2fee012bae-podres\") pod \"perf-node-gather-daemonset-btj94\" (UID: \"52f12a11-3f8c-42c6-9305-7d2fee012bae\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-btj94"
Apr 21 16:06:14.707240 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.707125 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/52f12a11-3f8c-42c6-9305-7d2fee012bae-proc\") pod \"perf-node-gather-daemonset-btj94\" (UID: \"52f12a11-3f8c-42c6-9305-7d2fee012bae\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-btj94"
Apr 21 16:06:14.707240 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.707139 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/52f12a11-3f8c-42c6-9305-7d2fee012bae-lib-modules\") pod \"perf-node-gather-daemonset-btj94\" (UID: \"52f12a11-3f8c-42c6-9305-7d2fee012bae\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-btj94"
Apr 21 16:06:14.707240 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.707189 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/52f12a11-3f8c-42c6-9305-7d2fee012bae-podres\") pod \"perf-node-gather-daemonset-btj94\" (UID: \"52f12a11-3f8c-42c6-9305-7d2fee012bae\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-btj94"
Apr 21 16:06:14.707523 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.707258 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/52f12a11-3f8c-42c6-9305-7d2fee012bae-lib-modules\") pod \"perf-node-gather-daemonset-btj94\" (UID: \"52f12a11-3f8c-42c6-9305-7d2fee012bae\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-btj94"
Apr 21 16:06:14.715789 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.715767 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c847s\" (UniqueName: \"kubernetes.io/projected/52f12a11-3f8c-42c6-9305-7d2fee012bae-kube-api-access-c847s\") pod \"perf-node-gather-daemonset-btj94\" (UID: \"52f12a11-3f8c-42c6-9305-7d2fee012bae\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-btj94"
Apr 21 16:06:14.866871 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.866782 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-btj94"
Apr 21 16:06:14.997832 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:14.997805 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dt7xd/perf-node-gather-daemonset-btj94"]
Apr 21 16:06:14.999941 ip-10-0-128-232 kubenswrapper[2569]: W0421 16:06:14.999910 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod52f12a11_3f8c_42c6_9305_7d2fee012bae.slice/crio-679edaa0258041b3ebd8adf38a172df8e51b76b2c69639e4c154779f0c6a62f5 WatchSource:0}: Error finding container 679edaa0258041b3ebd8adf38a172df8e51b76b2c69639e4c154779f0c6a62f5: Status 404 returned error can't find the container with id 679edaa0258041b3ebd8adf38a172df8e51b76b2c69639e4c154779f0c6a62f5
Apr 21 16:06:15.267303 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:15.267264 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-btj94" event={"ID":"52f12a11-3f8c-42c6-9305-7d2fee012bae","Type":"ContainerStarted","Data":"889de4a22a89013f7aad477681eed2c82b864926f19ee2bf9bab1f0872db4d4d"}
Apr 21 16:06:15.267303 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:15.267305 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-btj94" event={"ID":"52f12a11-3f8c-42c6-9305-7d2fee012bae","Type":"ContainerStarted","Data":"679edaa0258041b3ebd8adf38a172df8e51b76b2c69639e4c154779f0c6a62f5"}
Apr 21 16:06:15.267597 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:15.267387 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-btj94"
Apr 21 16:06:15.285724 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:15.285675 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-btj94" podStartSLOduration=1.2856600710000001 podStartE2EDuration="1.285660071s" podCreationTimestamp="2026-04-21 16:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:06:15.283904988 +0000 UTC m=+1850.855937932" watchObservedRunningTime="2026-04-21 16:06:15.285660071 +0000 UTC m=+1850.857693014"
Apr 21 16:06:15.578888 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:15.578805 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-gdwg8_cb43fce4-df4e-4cca-a455-90d323512faf/dns/0.log"
Apr 21 16:06:15.599725 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:15.599693 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-gdwg8_cb43fce4-df4e-4cca-a455-90d323512faf/kube-rbac-proxy/0.log"
Apr 21 16:06:15.693985 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:15.693958 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9b9sq_50b93a7e-ace3-4153-b2d3-ea527a654b34/dns-node-resolver/0.log"
Apr 21 16:06:16.182908 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:16.182869 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-74dfb5f878-97xdw_23290611-2011-4b35-851e-ccedc88a9391/registry/0.log"
Apr 21 16:06:16.226790 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:16.226759 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jddzn_de4d7367-97b7-475a-b70f-d1b1f47d5106/node-ca/0.log"
Apr 21 16:06:17.210112 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:17.210061 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-744b6c97cd-ktnzw_e601bc85-e6ec-4e42-8728-90ec8be4699c/router/0.log"
Apr 21 16:06:17.647819 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:17.647728 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-jpjds_fffa5175-92d5-48ec-a153-baf3f061b044/serve-healthcheck-canary/0.log"
Apr 21 16:06:18.148045 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:18.148006 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-mbdj9_9b8e8467-e89b-4cd1-b772-d34e71416962/insights-operator/0.log"
Apr 21 16:06:18.150187 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:18.150156 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-mbdj9_9b8e8467-e89b-4cd1-b772-d34e71416962/insights-operator/1.log"
Apr 21 16:06:18.239014 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:18.238982 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hpj5v_999507b2-ef4d-48d7-acfe-a00e4f249573/kube-rbac-proxy/0.log"
Apr 21 16:06:18.259568 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:18.259534 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hpj5v_999507b2-ef4d-48d7-acfe-a00e4f249573/exporter/0.log"
Apr 21 16:06:18.280707 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:18.280677 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hpj5v_999507b2-ef4d-48d7-acfe-a00e4f249573/extractor/0.log"
Apr 21 16:06:20.819446 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:20.819411 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-fb974466f-4bczq_831e2eae-f4ad-4a65-b953-17c2024d94c1/manager/0.log"
Apr 21 16:06:21.281579 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:21.281539 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-btj94"
Apr 21 16:06:21.454460 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:21.454422 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-6bf4699d45-xvlhs_ba9f8bad-3d2d-4707-b90e-c49c16cd94bc/manager/0.log"
Apr 21 16:06:26.630128 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:26.630039 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-qdkns_f4e9ee47-7719-418f-90bf-ada2af6eab08/kube-storage-version-migrator-operator/1.log"
Apr 21 16:06:26.632472 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:26.632450 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-qdkns_f4e9ee47-7719-418f-90bf-ada2af6eab08/kube-storage-version-migrator-operator/0.log"
Apr 21 16:06:27.904551 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:27.904520 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xwpr7_402ecfa4-798f-4e6f-9d15-5c6ef953439a/kube-multus-additional-cni-plugins/0.log"
Apr 21 16:06:27.927116 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:27.927093 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xwpr7_402ecfa4-798f-4e6f-9d15-5c6ef953439a/egress-router-binary-copy/0.log"
Apr 21 16:06:27.947869 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:27.947842 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xwpr7_402ecfa4-798f-4e6f-9d15-5c6ef953439a/cni-plugins/0.log"
Apr 21 16:06:27.969244 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:27.969225 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xwpr7_402ecfa4-798f-4e6f-9d15-5c6ef953439a/bond-cni-plugin/0.log"
Apr 21 16:06:27.990694 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:27.990675 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xwpr7_402ecfa4-798f-4e6f-9d15-5c6ef953439a/routeoverride-cni/0.log"
Apr 21 16:06:28.042130 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:28.042106 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xwpr7_402ecfa4-798f-4e6f-9d15-5c6ef953439a/whereabouts-cni-bincopy/0.log"
Apr 21 16:06:28.059772 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:28.059745 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xwpr7_402ecfa4-798f-4e6f-9d15-5c6ef953439a/whereabouts-cni/0.log"
Apr 21 16:06:28.102242 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:28.102213 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cx47v_c2d06a3a-0637-4a19-b2ba-af896d234845/kube-multus/0.log"
Apr 21 16:06:28.240098 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:28.240070 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-lp64c_893ee07d-ac5e-4593-93fd-80655b690072/network-metrics-daemon/0.log"
Apr 21 16:06:28.258755 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:28.258722 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-lp64c_893ee07d-ac5e-4593-93fd-80655b690072/kube-rbac-proxy/0.log"
Apr 21 16:06:29.728352 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:29.728320 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wkxqg_7b0083d8-b152-40fa-9a89-e3180ed1747d/ovn-controller/0.log"
Apr 21 16:06:29.743912 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:29.743887 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wkxqg_7b0083d8-b152-40fa-9a89-e3180ed1747d/ovn-acl-logging/0.log"
Apr 21 16:06:29.760404 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:29.760359 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wkxqg_7b0083d8-b152-40fa-9a89-e3180ed1747d/ovn-acl-logging/1.log"
Apr 21 16:06:29.785693 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:29.785665 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wkxqg_7b0083d8-b152-40fa-9a89-e3180ed1747d/kube-rbac-proxy-node/0.log"
Apr 21 16:06:29.806910 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:29.806885 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wkxqg_7b0083d8-b152-40fa-9a89-e3180ed1747d/kube-rbac-proxy-ovn-metrics/0.log"
Apr 21 16:06:29.822501 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:29.822465 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wkxqg_7b0083d8-b152-40fa-9a89-e3180ed1747d/northd/0.log"
Apr 21 16:06:29.842768 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:29.842741 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wkxqg_7b0083d8-b152-40fa-9a89-e3180ed1747d/nbdb/0.log"
Apr 21 16:06:29.862420 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:29.862395 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wkxqg_7b0083d8-b152-40fa-9a89-e3180ed1747d/sbdb/0.log"
Apr 21 16:06:30.034310 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:30.034208 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wkxqg_7b0083d8-b152-40fa-9a89-e3180ed1747d/ovnkube-controller/0.log"
Apr 21 16:06:31.115533 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:31.115498 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-9nzlb_c18e8ca8-7b38-4eb1-8ec2-a2817736c6c4/check-endpoints/0.log"
Apr 21 16:06:31.137214 ip-10-0-128-232 kubenswrapper[2569]: I0421 16:06:31.137180 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-ct78s_bbf400a9-66da-48f5-ba51-6ecd75c50fa2/network-check-target-container/0.log"